Nov 22 01:34:21 np0005531888 kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 22 01:34:21 np0005531888 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 22 01:34:21 np0005531888 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 01:34:21 np0005531888 kernel: BIOS-provided physical RAM map:
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 22 01:34:21 np0005531888 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 22 01:34:21 np0005531888 kernel: NX (Execute Disable) protection: active
Nov 22 01:34:21 np0005531888 kernel: APIC: Static calls initialized
Nov 22 01:34:21 np0005531888 kernel: SMBIOS 2.8 present.
Nov 22 01:34:21 np0005531888 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 22 01:34:21 np0005531888 kernel: Hypervisor detected: KVM
Nov 22 01:34:21 np0005531888 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 22 01:34:21 np0005531888 kernel: kvm-clock: using sched offset of 10674275702 cycles
Nov 22 01:34:21 np0005531888 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 22 01:34:21 np0005531888 kernel: tsc: Detected 2799.998 MHz processor
Nov 22 01:34:21 np0005531888 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 22 01:34:21 np0005531888 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 22 01:34:21 np0005531888 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 22 01:34:21 np0005531888 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 22 01:34:21 np0005531888 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 22 01:34:21 np0005531888 kernel: Using GB pages for direct mapping
Nov 22 01:34:21 np0005531888 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 22 01:34:21 np0005531888 kernel: ACPI: Early table checksum verification disabled
Nov 22 01:34:21 np0005531888 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 22 01:34:21 np0005531888 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:21 np0005531888 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:21 np0005531888 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:21 np0005531888 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 22 01:34:21 np0005531888 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:21 np0005531888 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:21 np0005531888 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 22 01:34:21 np0005531888 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 22 01:34:21 np0005531888 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 22 01:34:21 np0005531888 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 22 01:34:21 np0005531888 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 22 01:34:21 np0005531888 kernel: No NUMA configuration found
Nov 22 01:34:21 np0005531888 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 22 01:34:21 np0005531888 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 22 01:34:21 np0005531888 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 22 01:34:21 np0005531888 kernel: Zone ranges:
Nov 22 01:34:21 np0005531888 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 22 01:34:21 np0005531888 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 22 01:34:21 np0005531888 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 01:34:21 np0005531888 kernel:  Device   empty
Nov 22 01:34:21 np0005531888 kernel: Movable zone start for each node
Nov 22 01:34:21 np0005531888 kernel: Early memory node ranges
Nov 22 01:34:21 np0005531888 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 22 01:34:21 np0005531888 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 22 01:34:21 np0005531888 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 01:34:21 np0005531888 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 22 01:34:21 np0005531888 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 22 01:34:21 np0005531888 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 22 01:34:21 np0005531888 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 22 01:34:21 np0005531888 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 22 01:34:21 np0005531888 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 22 01:34:21 np0005531888 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 22 01:34:21 np0005531888 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 22 01:34:21 np0005531888 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 22 01:34:21 np0005531888 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 22 01:34:21 np0005531888 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 22 01:34:21 np0005531888 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 22 01:34:21 np0005531888 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 22 01:34:21 np0005531888 kernel: TSC deadline timer available
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Max. logical packages:   8
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Max. logical dies:       8
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Max. dies per package:   1
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Max. threads per core:   1
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Num. cores per package:     1
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Num. threads per package:   1
Nov 22 01:34:21 np0005531888 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 22 01:34:21 np0005531888 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 22 01:34:21 np0005531888 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 22 01:34:21 np0005531888 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 22 01:34:21 np0005531888 kernel: Booting paravirtualized kernel on KVM
Nov 22 01:34:21 np0005531888 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 22 01:34:21 np0005531888 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 22 01:34:21 np0005531888 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 22 01:34:21 np0005531888 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 22 01:34:21 np0005531888 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 01:34:21 np0005531888 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 22 01:34:21 np0005531888 kernel: random: crng init done
Nov 22 01:34:21 np0005531888 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: Fallback order for Node 0: 0 
Nov 22 01:34:21 np0005531888 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 22 01:34:21 np0005531888 kernel: Policy zone: Normal
Nov 22 01:34:21 np0005531888 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 22 01:34:21 np0005531888 kernel: software IO TLB: area num 8.
Nov 22 01:34:21 np0005531888 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 22 01:34:21 np0005531888 kernel: ftrace: allocating 49298 entries in 193 pages
Nov 22 01:34:21 np0005531888 kernel: ftrace: allocated 193 pages with 3 groups
Nov 22 01:34:21 np0005531888 kernel: Dynamic Preempt: voluntary
Nov 22 01:34:21 np0005531888 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 22 01:34:21 np0005531888 kernel: rcu: 	RCU event tracing is enabled.
Nov 22 01:34:21 np0005531888 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 22 01:34:21 np0005531888 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 22 01:34:21 np0005531888 kernel: 	Rude variant of Tasks RCU enabled.
Nov 22 01:34:21 np0005531888 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 22 01:34:21 np0005531888 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 22 01:34:21 np0005531888 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 22 01:34:21 np0005531888 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 01:34:21 np0005531888 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 01:34:21 np0005531888 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 01:34:21 np0005531888 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 22 01:34:21 np0005531888 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 22 01:34:21 np0005531888 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 22 01:34:21 np0005531888 kernel: Console: colour VGA+ 80x25
Nov 22 01:34:21 np0005531888 kernel: printk: console [ttyS0] enabled
Nov 22 01:34:21 np0005531888 kernel: ACPI: Core revision 20230331
Nov 22 01:34:21 np0005531888 kernel: APIC: Switch to symmetric I/O mode setup
Nov 22 01:34:21 np0005531888 kernel: x2apic enabled
Nov 22 01:34:21 np0005531888 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 22 01:34:21 np0005531888 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 22 01:34:21 np0005531888 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 22 01:34:21 np0005531888 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 22 01:34:21 np0005531888 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 22 01:34:21 np0005531888 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 22 01:34:21 np0005531888 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 22 01:34:21 np0005531888 kernel: Spectre V2 : Mitigation: Retpolines
Nov 22 01:34:21 np0005531888 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 22 01:34:21 np0005531888 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 22 01:34:21 np0005531888 kernel: RETBleed: Mitigation: untrained return thunk
Nov 22 01:34:21 np0005531888 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 22 01:34:21 np0005531888 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 22 01:34:21 np0005531888 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 22 01:34:21 np0005531888 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 22 01:34:21 np0005531888 kernel: x86/bugs: return thunk changed
Nov 22 01:34:21 np0005531888 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 22 01:34:21 np0005531888 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 22 01:34:21 np0005531888 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 22 01:34:21 np0005531888 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 22 01:34:21 np0005531888 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 22 01:34:21 np0005531888 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 22 01:34:21 np0005531888 kernel: Freeing SMP alternatives memory: 40K
Nov 22 01:34:21 np0005531888 kernel: pid_max: default: 32768 minimum: 301
Nov 22 01:34:21 np0005531888 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 22 01:34:21 np0005531888 kernel: landlock: Up and running.
Nov 22 01:34:21 np0005531888 kernel: Yama: becoming mindful.
Nov 22 01:34:21 np0005531888 kernel: SELinux:  Initializing.
Nov 22 01:34:21 np0005531888 kernel: LSM support for eBPF active
Nov 22 01:34:21 np0005531888 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 22 01:34:21 np0005531888 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 22 01:34:21 np0005531888 kernel: ... version:                0
Nov 22 01:34:21 np0005531888 kernel: ... bit width:              48
Nov 22 01:34:21 np0005531888 kernel: ... generic registers:      6
Nov 22 01:34:21 np0005531888 kernel: ... value mask:             0000ffffffffffff
Nov 22 01:34:21 np0005531888 kernel: ... max period:             00007fffffffffff
Nov 22 01:34:21 np0005531888 kernel: ... fixed-purpose events:   0
Nov 22 01:34:21 np0005531888 kernel: ... event mask:             000000000000003f
Nov 22 01:34:21 np0005531888 kernel: signal: max sigframe size: 1776
Nov 22 01:34:21 np0005531888 kernel: rcu: Hierarchical SRCU implementation.
Nov 22 01:34:21 np0005531888 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 22 01:34:21 np0005531888 kernel: smp: Bringing up secondary CPUs ...
Nov 22 01:34:21 np0005531888 kernel: smpboot: x86: Booting SMP configuration:
Nov 22 01:34:21 np0005531888 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 22 01:34:21 np0005531888 kernel: smp: Brought up 1 node, 8 CPUs
Nov 22 01:34:21 np0005531888 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 22 01:34:21 np0005531888 kernel: node 0 deferred pages initialised in 9ms
Nov 22 01:34:21 np0005531888 kernel: Memory: 7765868K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616268K reserved, 0K cma-reserved)
Nov 22 01:34:21 np0005531888 kernel: devtmpfs: initialized
Nov 22 01:34:21 np0005531888 kernel: x86/mm: Memory block size: 128MB
Nov 22 01:34:21 np0005531888 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 22 01:34:21 np0005531888 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: pinctrl core: initialized pinctrl subsystem
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 22 01:34:21 np0005531888 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 22 01:34:21 np0005531888 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 22 01:34:21 np0005531888 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 22 01:34:21 np0005531888 kernel: audit: initializing netlink subsys (disabled)
Nov 22 01:34:21 np0005531888 kernel: audit: type=2000 audit(1763793259.196:1): state=initialized audit_enabled=0 res=1
Nov 22 01:34:21 np0005531888 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 22 01:34:21 np0005531888 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 22 01:34:21 np0005531888 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 22 01:34:21 np0005531888 kernel: cpuidle: using governor menu
Nov 22 01:34:21 np0005531888 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 22 01:34:21 np0005531888 kernel: PCI: Using configuration type 1 for base access
Nov 22 01:34:21 np0005531888 kernel: PCI: Using configuration type 1 for extended access
Nov 22 01:34:21 np0005531888 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 22 01:34:21 np0005531888 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 22 01:34:21 np0005531888 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 22 01:34:21 np0005531888 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 22 01:34:21 np0005531888 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 22 01:34:21 np0005531888 kernel: Demotion targets for Node 0: null
Nov 22 01:34:21 np0005531888 kernel: cryptd: max_cpu_qlen set to 1000
Nov 22 01:34:21 np0005531888 kernel: ACPI: Added _OSI(Module Device)
Nov 22 01:34:21 np0005531888 kernel: ACPI: Added _OSI(Processor Device)
Nov 22 01:34:21 np0005531888 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 22 01:34:21 np0005531888 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 22 01:34:21 np0005531888 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 22 01:34:21 np0005531888 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 22 01:34:21 np0005531888 kernel: ACPI: Interpreter enabled
Nov 22 01:34:21 np0005531888 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 22 01:34:21 np0005531888 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 22 01:34:21 np0005531888 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 22 01:34:21 np0005531888 kernel: PCI: Using E820 reservations for host bridge windows
Nov 22 01:34:21 np0005531888 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 22 01:34:21 np0005531888 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 22 01:34:21 np0005531888 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [3] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [4] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [5] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [6] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [7] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [8] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [9] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [10] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [11] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [12] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [13] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [14] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [15] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [16] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [17] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [18] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [19] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [20] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [21] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [22] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [23] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [24] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [25] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [26] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [27] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [28] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [29] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [30] registered
Nov 22 01:34:21 np0005531888 kernel: acpiphp: Slot [31] registered
Nov 22 01:34:21 np0005531888 kernel: PCI host bridge to bus 0000:00
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 22 01:34:21 np0005531888 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 22 01:34:21 np0005531888 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 22 01:34:21 np0005531888 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 22 01:34:21 np0005531888 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 22 01:34:21 np0005531888 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 22 01:34:21 np0005531888 kernel: iommu: Default domain type: Translated
Nov 22 01:34:21 np0005531888 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 22 01:34:21 np0005531888 kernel: SCSI subsystem initialized
Nov 22 01:34:21 np0005531888 kernel: ACPI: bus type USB registered
Nov 22 01:34:21 np0005531888 kernel: usbcore: registered new interface driver usbfs
Nov 22 01:34:21 np0005531888 kernel: usbcore: registered new interface driver hub
Nov 22 01:34:21 np0005531888 kernel: usbcore: registered new device driver usb
Nov 22 01:34:21 np0005531888 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 22 01:34:21 np0005531888 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 22 01:34:21 np0005531888 kernel: PTP clock support registered
Nov 22 01:34:21 np0005531888 kernel: EDAC MC: Ver: 3.0.0
Nov 22 01:34:21 np0005531888 kernel: NetLabel: Initializing
Nov 22 01:34:21 np0005531888 kernel: NetLabel:  domain hash size = 128
Nov 22 01:34:21 np0005531888 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 22 01:34:21 np0005531888 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 22 01:34:21 np0005531888 kernel: PCI: Using ACPI for IRQ routing
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 22 01:34:21 np0005531888 kernel: vgaarb: loaded
Nov 22 01:34:21 np0005531888 kernel: clocksource: Switched to clocksource kvm-clock
Nov 22 01:34:21 np0005531888 kernel: VFS: Disk quotas dquot_6.6.0
Nov 22 01:34:21 np0005531888 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 22 01:34:21 np0005531888 kernel: pnp: PnP ACPI init
Nov 22 01:34:21 np0005531888 kernel: pnp: PnP ACPI: found 5 devices
Nov 22 01:34:21 np0005531888 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_INET protocol family
Nov 22 01:34:21 np0005531888 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 22 01:34:21 np0005531888 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_XDP protocol family
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 22 01:34:21 np0005531888 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 22 01:34:21 np0005531888 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 22 01:34:21 np0005531888 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 109714 usecs
Nov 22 01:34:21 np0005531888 kernel: PCI: CLS 0 bytes, default 64
Nov 22 01:34:21 np0005531888 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 22 01:34:21 np0005531888 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 22 01:34:21 np0005531888 kernel: ACPI: bus type thunderbolt registered
Nov 22 01:34:21 np0005531888 kernel: Trying to unpack rootfs image as initramfs...
Nov 22 01:34:21 np0005531888 kernel: Initialise system trusted keyrings
Nov 22 01:34:21 np0005531888 kernel: Key type blacklist registered
Nov 22 01:34:21 np0005531888 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 22 01:34:21 np0005531888 kernel: zbud: loaded
Nov 22 01:34:21 np0005531888 kernel: integrity: Platform Keyring initialized
Nov 22 01:34:21 np0005531888 kernel: integrity: Machine keyring initialized
Nov 22 01:34:21 np0005531888 kernel: Freeing initrd memory: 85868K
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_ALG protocol family
Nov 22 01:34:21 np0005531888 kernel: xor: automatically using best checksumming function   avx       
Nov 22 01:34:21 np0005531888 kernel: Key type asymmetric registered
Nov 22 01:34:21 np0005531888 kernel: Asymmetric key parser 'x509' registered
Nov 22 01:34:21 np0005531888 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 22 01:34:21 np0005531888 kernel: io scheduler mq-deadline registered
Nov 22 01:34:21 np0005531888 kernel: io scheduler kyber registered
Nov 22 01:34:21 np0005531888 kernel: io scheduler bfq registered
Nov 22 01:34:21 np0005531888 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 22 01:34:21 np0005531888 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 22 01:34:21 np0005531888 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 22 01:34:21 np0005531888 kernel: ACPI: button: Power Button [PWRF]
Nov 22 01:34:21 np0005531888 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 22 01:34:21 np0005531888 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 22 01:34:21 np0005531888 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 22 01:34:21 np0005531888 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 22 01:34:21 np0005531888 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 22 01:34:21 np0005531888 kernel: Non-volatile memory driver v1.3
Nov 22 01:34:21 np0005531888 kernel: rdac: device handler registered
Nov 22 01:34:21 np0005531888 kernel: hp_sw: device handler registered
Nov 22 01:34:21 np0005531888 kernel: emc: device handler registered
Nov 22 01:34:21 np0005531888 kernel: alua: device handler registered
Nov 22 01:34:21 np0005531888 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 22 01:34:21 np0005531888 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 22 01:34:21 np0005531888 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 22 01:34:21 np0005531888 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 22 01:34:21 np0005531888 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 22 01:34:21 np0005531888 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 22 01:34:21 np0005531888 kernel: usb usb1: Product: UHCI Host Controller
Nov 22 01:34:21 np0005531888 kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 22 01:34:21 np0005531888 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 22 01:34:21 np0005531888 kernel: hub 1-0:1.0: USB hub found
Nov 22 01:34:21 np0005531888 kernel: hub 1-0:1.0: 2 ports detected
Nov 22 01:34:21 np0005531888 kernel: usbcore: registered new interface driver usbserial_generic
Nov 22 01:34:21 np0005531888 kernel: usbserial: USB Serial support registered for generic
Nov 22 01:34:21 np0005531888 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 22 01:34:21 np0005531888 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 22 01:34:21 np0005531888 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 22 01:34:21 np0005531888 kernel: mousedev: PS/2 mouse device common for all mice
Nov 22 01:34:21 np0005531888 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 22 01:34:21 np0005531888 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 22 01:34:21 np0005531888 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 22 01:34:21 np0005531888 kernel: rtc_cmos 00:04: registered as rtc0
Nov 22 01:34:21 np0005531888 kernel: rtc_cmos 00:04: setting system clock to 2025-11-22T06:34:20 UTC (1763793260)
Nov 22 01:34:21 np0005531888 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 22 01:34:21 np0005531888 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 22 01:34:21 np0005531888 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 22 01:34:21 np0005531888 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 22 01:34:21 np0005531888 kernel: usbcore: registered new interface driver usbhid
Nov 22 01:34:21 np0005531888 kernel: usbhid: USB HID core driver
Nov 22 01:34:21 np0005531888 kernel: drop_monitor: Initializing network drop monitor service
Nov 22 01:34:21 np0005531888 kernel: Initializing XFRM netlink socket
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_INET6 protocol family
Nov 22 01:34:21 np0005531888 kernel: Segment Routing with IPv6
Nov 22 01:34:21 np0005531888 kernel: NET: Registered PF_PACKET protocol family
Nov 22 01:34:21 np0005531888 kernel: mpls_gso: MPLS GSO support
Nov 22 01:34:21 np0005531888 kernel: IPI shorthand broadcast: enabled
Nov 22 01:34:21 np0005531888 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 22 01:34:21 np0005531888 kernel: AES CTR mode by8 optimization enabled
Nov 22 01:34:21 np0005531888 kernel: sched_clock: Marking stable (1238006600, 147083995)->(1492020016, -106929421)
Nov 22 01:34:21 np0005531888 kernel: registered taskstats version 1
Nov 22 01:34:21 np0005531888 kernel: Loading compiled-in X.509 certificates
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 22 01:34:21 np0005531888 kernel: Demotion targets for Node 0: null
Nov 22 01:34:21 np0005531888 kernel: page_owner is disabled
Nov 22 01:34:21 np0005531888 kernel: Key type .fscrypt registered
Nov 22 01:34:21 np0005531888 kernel: Key type fscrypt-provisioning registered
Nov 22 01:34:21 np0005531888 kernel: Key type big_key registered
Nov 22 01:34:21 np0005531888 kernel: Key type encrypted registered
Nov 22 01:34:21 np0005531888 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 22 01:34:21 np0005531888 kernel: Loading compiled-in module X.509 certificates
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 01:34:21 np0005531888 kernel: ima: Allocated hash algorithm: sha256
Nov 22 01:34:21 np0005531888 kernel: ima: No architecture policies found
Nov 22 01:34:21 np0005531888 kernel: evm: Initialising EVM extended attributes:
Nov 22 01:34:21 np0005531888 kernel: evm: security.selinux
Nov 22 01:34:21 np0005531888 kernel: evm: security.SMACK64 (disabled)
Nov 22 01:34:21 np0005531888 kernel: evm: security.SMACK64EXEC (disabled)
Nov 22 01:34:21 np0005531888 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 22 01:34:21 np0005531888 kernel: evm: security.SMACK64MMAP (disabled)
Nov 22 01:34:21 np0005531888 kernel: evm: security.apparmor (disabled)
Nov 22 01:34:21 np0005531888 kernel: evm: security.ima
Nov 22 01:34:21 np0005531888 kernel: evm: security.capability
Nov 22 01:34:21 np0005531888 kernel: evm: HMAC attrs: 0x1
Nov 22 01:34:21 np0005531888 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 22 01:34:21 np0005531888 kernel: Running certificate verification RSA selftest
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 22 01:34:21 np0005531888 kernel: Running certificate verification ECDSA selftest
Nov 22 01:34:21 np0005531888 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 22 01:34:21 np0005531888 kernel: clk: Disabling unused clocks
Nov 22 01:34:21 np0005531888 kernel: Freeing unused decrypted memory: 2028K
Nov 22 01:34:21 np0005531888 kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 22 01:34:21 np0005531888 kernel: Write protecting the kernel read-only data: 30720k
Nov 22 01:34:21 np0005531888 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 22 01:34:21 np0005531888 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 22 01:34:21 np0005531888 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 22 01:34:21 np0005531888 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 22 01:34:21 np0005531888 kernel: usb 1-1: Manufacturer: QEMU
Nov 22 01:34:21 np0005531888 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 22 01:34:21 np0005531888 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 22 01:34:21 np0005531888 kernel: Run /init as init process
Nov 22 01:34:21 np0005531888 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 22 01:34:21 np0005531888 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 22 01:34:21 np0005531888 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 01:34:21 np0005531888 systemd: Detected virtualization kvm.
Nov 22 01:34:21 np0005531888 systemd: Detected architecture x86-64.
Nov 22 01:34:21 np0005531888 systemd: Running in initrd.
Nov 22 01:34:21 np0005531888 systemd: No hostname configured, using default hostname.
Nov 22 01:34:21 np0005531888 systemd: Hostname set to <localhost>.
Nov 22 01:34:21 np0005531888 systemd: Initializing machine ID from VM UUID.
Nov 22 01:34:21 np0005531888 systemd: Queued start job for default target Initrd Default Target.
Nov 22 01:34:21 np0005531888 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 01:34:21 np0005531888 systemd: Reached target Local Encrypted Volumes.
Nov 22 01:34:21 np0005531888 systemd: Reached target Initrd /usr File System.
Nov 22 01:34:21 np0005531888 systemd: Reached target Local File Systems.
Nov 22 01:34:21 np0005531888 systemd: Reached target Path Units.
Nov 22 01:34:21 np0005531888 systemd: Reached target Slice Units.
Nov 22 01:34:21 np0005531888 systemd: Reached target Swaps.
Nov 22 01:34:21 np0005531888 systemd: Reached target Timer Units.
Nov 22 01:34:21 np0005531888 systemd: Listening on D-Bus System Message Bus Socket.
Nov 22 01:34:21 np0005531888 systemd: Listening on Journal Socket (/dev/log).
Nov 22 01:34:21 np0005531888 systemd: Listening on Journal Socket.
Nov 22 01:34:21 np0005531888 systemd: Listening on udev Control Socket.
Nov 22 01:34:21 np0005531888 systemd: Listening on udev Kernel Socket.
Nov 22 01:34:21 np0005531888 systemd: Reached target Socket Units.
Nov 22 01:34:21 np0005531888 systemd: Starting Create List of Static Device Nodes...
Nov 22 01:34:21 np0005531888 systemd: Starting Journal Service...
Nov 22 01:34:21 np0005531888 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 01:34:21 np0005531888 systemd: Starting Apply Kernel Variables...
Nov 22 01:34:21 np0005531888 systemd: Starting Create System Users...
Nov 22 01:34:21 np0005531888 systemd: Starting Setup Virtual Console...
Nov 22 01:34:21 np0005531888 systemd: Finished Create List of Static Device Nodes.
Nov 22 01:34:21 np0005531888 systemd: Finished Apply Kernel Variables.
Nov 22 01:34:21 np0005531888 systemd: Finished Create System Users.
Nov 22 01:34:21 np0005531888 systemd-journald[308]: Journal started
Nov 22 01:34:21 np0005531888 systemd-journald[308]: Runtime Journal (/run/log/journal/0008dc3a3a62409d980494baff3c1d3a) is 8.0M, max 153.6M, 145.6M free.
Nov 22 01:34:21 np0005531888 systemd-sysusers[312]: Creating group 'users' with GID 100.
Nov 22 01:34:21 np0005531888 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Nov 22 01:34:21 np0005531888 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 22 01:34:21 np0005531888 systemd: Starting Create Static Device Nodes in /dev...
Nov 22 01:34:21 np0005531888 systemd: Started Journal Service.
Nov 22 01:34:21 np0005531888 systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 01:34:21 np0005531888 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 01:34:21 np0005531888 systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 01:34:21 np0005531888 systemd[1]: Finished Setup Virtual Console.
Nov 22 01:34:21 np0005531888 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 22 01:34:21 np0005531888 systemd[1]: Starting dracut cmdline hook...
Nov 22 01:34:21 np0005531888 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 22 01:34:21 np0005531888 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 01:34:21 np0005531888 systemd[1]: Finished dracut cmdline hook.
Nov 22 01:34:21 np0005531888 systemd[1]: Starting dracut pre-udev hook...
Nov 22 01:34:21 np0005531888 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 22 01:34:21 np0005531888 kernel: device-mapper: uevent: version 1.0.3
Nov 22 01:34:21 np0005531888 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 22 01:34:21 np0005531888 kernel: RPC: Registered named UNIX socket transport module.
Nov 22 01:34:21 np0005531888 kernel: RPC: Registered udp transport module.
Nov 22 01:34:21 np0005531888 kernel: RPC: Registered tcp transport module.
Nov 22 01:34:21 np0005531888 kernel: RPC: Registered tcp-with-tls transport module.
Nov 22 01:34:21 np0005531888 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 22 01:34:21 np0005531888 rpc.statd[444]: Version 2.5.4 starting
Nov 22 01:34:21 np0005531888 rpc.statd[444]: Initializing NSM state
Nov 22 01:34:21 np0005531888 rpc.idmapd[449]: Setting log level to 0
Nov 22 01:34:21 np0005531888 systemd[1]: Finished dracut pre-udev hook.
Nov 22 01:34:21 np0005531888 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 01:34:21 np0005531888 systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 01:34:21 np0005531888 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 01:34:21 np0005531888 systemd[1]: Starting dracut pre-trigger hook...
Nov 22 01:34:21 np0005531888 systemd[1]: Finished dracut pre-trigger hook.
Nov 22 01:34:22 np0005531888 systemd[1]: Starting Coldplug All udev Devices...
Nov 22 01:34:22 np0005531888 systemd[1]: Created slice Slice /system/modprobe.
Nov 22 01:34:22 np0005531888 systemd[1]: Starting Load Kernel Module configfs...
Nov 22 01:34:22 np0005531888 systemd[1]: Finished Coldplug All udev Devices.
Nov 22 01:34:22 np0005531888 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 01:34:22 np0005531888 systemd[1]: Finished Load Kernel Module configfs.
Nov 22 01:34:22 np0005531888 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target Network.
Nov 22 01:34:22 np0005531888 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 01:34:22 np0005531888 systemd[1]: Starting dracut initqueue hook...
Nov 22 01:34:22 np0005531888 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 22 01:34:22 np0005531888 systemd[1]: Mounting Kernel Configuration File System...
Nov 22 01:34:22 np0005531888 systemd[1]: Mounted Kernel Configuration File System.
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target System Initialization.
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target Basic System.
Nov 22 01:34:22 np0005531888 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 22 01:34:22 np0005531888 kernel: scsi host0: ata_piix
Nov 22 01:34:22 np0005531888 kernel: scsi host1: ata_piix
Nov 22 01:34:22 np0005531888 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 22 01:34:22 np0005531888 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 22 01:34:22 np0005531888 kernel: vda: vda1
Nov 22 01:34:22 np0005531888 kernel: ata1: found unknown device (class 0)
Nov 22 01:34:22 np0005531888 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 22 01:34:22 np0005531888 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 22 01:34:22 np0005531888 systemd-udevd[504]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 01:34:22 np0005531888 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 22 01:34:22 np0005531888 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 22 01:34:22 np0005531888 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 22 01:34:22 np0005531888 systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target Initrd Root Device.
Nov 22 01:34:22 np0005531888 systemd[1]: Finished dracut initqueue hook.
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 22 01:34:22 np0005531888 systemd[1]: Reached target Remote File Systems.
Nov 22 01:34:22 np0005531888 systemd[1]: Starting dracut pre-mount hook...
Nov 22 01:34:22 np0005531888 systemd[1]: Finished dracut pre-mount hook.
Nov 22 01:34:22 np0005531888 systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 22 01:34:22 np0005531888 systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Nov 22 01:34:22 np0005531888 systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 01:34:22 np0005531888 systemd[1]: Mounting /sysroot...
Nov 22 01:34:23 np0005531888 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 22 01:34:23 np0005531888 kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 22 01:34:23 np0005531888 kernel: XFS (vda1): Ending clean mount
Nov 22 01:34:23 np0005531888 systemd[1]: Mounted /sysroot.
Nov 22 01:34:23 np0005531888 systemd[1]: Reached target Initrd Root File System.
Nov 22 01:34:23 np0005531888 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 22 01:34:23 np0005531888 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 22 01:34:23 np0005531888 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 22 01:34:23 np0005531888 systemd[1]: Reached target Initrd File Systems.
Nov 22 01:34:23 np0005531888 systemd[1]: Reached target Initrd Default Target.
Nov 22 01:34:23 np0005531888 systemd[1]: Starting dracut mount hook...
Nov 22 01:34:24 np0005531888 systemd[1]: Finished dracut mount hook.
Nov 22 01:34:24 np0005531888 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 22 01:34:24 np0005531888 rpc.idmapd[449]: exiting on signal 15
Nov 22 01:34:24 np0005531888 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 22 01:34:24 np0005531888 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Network.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Timer Units.
Nov 22 01:34:24 np0005531888 systemd[1]: dbus.socket: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Initrd Default Target.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Basic System.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Initrd Root Device.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Initrd /usr File System.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Path Units.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Remote File Systems.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Slice Units.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Socket Units.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target System Initialization.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Local File Systems.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Swaps.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut mount hook.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut pre-mount hook.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut initqueue hook.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Apply Kernel Variables.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Coldplug All udev Devices.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut pre-trigger hook.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Setup Virtual Console.
Nov 22 01:34:24 np0005531888 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Closed udev Control Socket.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Closed udev Kernel Socket.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut pre-udev hook.
Nov 22 01:34:24 np0005531888 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped dracut cmdline hook.
Nov 22 01:34:24 np0005531888 systemd[1]: Starting Cleanup udev Database...
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 22 01:34:24 np0005531888 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 22 01:34:24 np0005531888 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Stopped Create System Users.
Nov 22 01:34:24 np0005531888 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 22 01:34:24 np0005531888 systemd[1]: Finished Cleanup udev Database.
Nov 22 01:34:24 np0005531888 systemd[1]: Reached target Switch Root.
Nov 22 01:34:24 np0005531888 systemd[1]: Starting Switch Root...
Nov 22 01:34:24 np0005531888 systemd[1]: Switching root.
Nov 22 01:34:24 np0005531888 systemd-journald[308]: Journal stopped
Nov 22 01:34:26 np0005531888 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 22 01:34:26 np0005531888 kernel: audit: type=1404 audit(1763793264.799:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:34:26 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:34:26 np0005531888 kernel: audit: type=1403 audit(1763793265.020:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 22 01:34:26 np0005531888 systemd: Successfully loaded SELinux policy in 256.151ms.
Nov 22 01:34:26 np0005531888 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 40.734ms.
Nov 22 01:34:26 np0005531888 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 01:34:26 np0005531888 systemd: Detected virtualization kvm.
Nov 22 01:34:26 np0005531888 systemd: Detected architecture x86-64.
Nov 22 01:34:26 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 01:34:26 np0005531888 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd: Stopped Switch Root.
Nov 22 01:34:26 np0005531888 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 22 01:34:26 np0005531888 systemd: Created slice Slice /system/getty.
Nov 22 01:34:26 np0005531888 systemd: Created slice Slice /system/serial-getty.
Nov 22 01:34:26 np0005531888 systemd: Created slice Slice /system/sshd-keygen.
Nov 22 01:34:26 np0005531888 systemd: Created slice User and Session Slice.
Nov 22 01:34:26 np0005531888 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 01:34:26 np0005531888 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 22 01:34:26 np0005531888 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 22 01:34:26 np0005531888 systemd: Reached target Local Encrypted Volumes.
Nov 22 01:34:26 np0005531888 systemd: Stopped target Switch Root.
Nov 22 01:34:26 np0005531888 systemd: Stopped target Initrd File Systems.
Nov 22 01:34:26 np0005531888 systemd: Stopped target Initrd Root File System.
Nov 22 01:34:26 np0005531888 systemd: Reached target Local Integrity Protected Volumes.
Nov 22 01:34:26 np0005531888 systemd: Reached target Path Units.
Nov 22 01:34:26 np0005531888 systemd: Reached target rpc_pipefs.target.
Nov 22 01:34:26 np0005531888 systemd: Reached target Slice Units.
Nov 22 01:34:26 np0005531888 systemd: Reached target Swaps.
Nov 22 01:34:26 np0005531888 systemd: Reached target Local Verity Protected Volumes.
Nov 22 01:34:26 np0005531888 systemd: Listening on RPCbind Server Activation Socket.
Nov 22 01:34:26 np0005531888 systemd: Reached target RPC Port Mapper.
Nov 22 01:34:26 np0005531888 systemd: Listening on Process Core Dump Socket.
Nov 22 01:34:26 np0005531888 systemd: Listening on initctl Compatibility Named Pipe.
Nov 22 01:34:26 np0005531888 systemd: Listening on udev Control Socket.
Nov 22 01:34:26 np0005531888 systemd: Listening on udev Kernel Socket.
Nov 22 01:34:26 np0005531888 systemd: Mounting Huge Pages File System...
Nov 22 01:34:26 np0005531888 systemd: Mounting POSIX Message Queue File System...
Nov 22 01:34:26 np0005531888 systemd: Mounting Kernel Debug File System...
Nov 22 01:34:26 np0005531888 systemd: Mounting Kernel Trace File System...
Nov 22 01:34:26 np0005531888 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 01:34:26 np0005531888 systemd: Starting Create List of Static Device Nodes...
Nov 22 01:34:26 np0005531888 systemd: Starting Load Kernel Module configfs...
Nov 22 01:34:26 np0005531888 systemd: Starting Load Kernel Module drm...
Nov 22 01:34:26 np0005531888 systemd: Starting Load Kernel Module efi_pstore...
Nov 22 01:34:26 np0005531888 systemd: Starting Load Kernel Module fuse...
Nov 22 01:34:26 np0005531888 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 22 01:34:26 np0005531888 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd: Stopped File System Check on Root Device.
Nov 22 01:34:26 np0005531888 systemd: Stopped Journal Service.
Nov 22 01:34:26 np0005531888 systemd: Starting Journal Service...
Nov 22 01:34:26 np0005531888 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 01:34:26 np0005531888 systemd: Starting Generate network units from Kernel command line...
Nov 22 01:34:26 np0005531888 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 01:34:26 np0005531888 systemd: Starting Remount Root and Kernel File Systems...
Nov 22 01:34:26 np0005531888 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 22 01:34:26 np0005531888 systemd: Starting Apply Kernel Variables...
Nov 22 01:34:26 np0005531888 kernel: ACPI: bus type drm_connector registered
Nov 22 01:34:26 np0005531888 systemd: Starting Coldplug All udev Devices...
Nov 22 01:34:26 np0005531888 systemd-journald[681]: Journal started
Nov 22 01:34:26 np0005531888 systemd-journald[681]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 01:34:26 np0005531888 systemd[1]: Queued start job for default target Multi-User System.
Nov 22 01:34:26 np0005531888 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd: Started Journal Service.
Nov 22 01:34:26 np0005531888 systemd[1]: Mounted Huge Pages File System.
Nov 22 01:34:26 np0005531888 kernel: fuse: init (API version 7.37)
Nov 22 01:34:26 np0005531888 systemd[1]: Mounted POSIX Message Queue File System.
Nov 22 01:34:26 np0005531888 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 22 01:34:26 np0005531888 systemd[1]: Mounted Kernel Debug File System.
Nov 22 01:34:26 np0005531888 systemd[1]: Mounted Kernel Trace File System.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Create List of Static Device Nodes.
Nov 22 01:34:26 np0005531888 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Load Kernel Module configfs.
Nov 22 01:34:26 np0005531888 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Load Kernel Module drm.
Nov 22 01:34:26 np0005531888 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 22 01:34:26 np0005531888 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Load Kernel Module fuse.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Generate network units from Kernel command line.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Apply Kernel Variables.
Nov 22 01:34:26 np0005531888 systemd[1]: Mounting FUSE Control File System...
Nov 22 01:34:26 np0005531888 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 01:34:26 np0005531888 systemd[1]: Starting Rebuild Hardware Database...
Nov 22 01:34:26 np0005531888 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 22 01:34:26 np0005531888 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 22 01:34:26 np0005531888 systemd[1]: Starting Load/Save OS Random Seed...
Nov 22 01:34:26 np0005531888 systemd[1]: Starting Create System Users...
Nov 22 01:34:26 np0005531888 systemd[1]: Mounted FUSE Control File System.
Nov 22 01:34:26 np0005531888 systemd-journald[681]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 01:34:26 np0005531888 systemd-journald[681]: Received client request to flush runtime journal.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Coldplug All udev Devices.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Load/Save OS Random Seed.
Nov 22 01:34:26 np0005531888 systemd[1]: Finished Create System Users.
Nov 22 01:34:26 np0005531888 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 01:34:26 np0005531888 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 01:34:27 np0005531888 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 01:34:27 np0005531888 systemd[1]: Reached target Preparation for Local File Systems.
Nov 22 01:34:27 np0005531888 systemd[1]: Reached target Local File Systems.
Nov 22 01:34:27 np0005531888 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 22 01:34:27 np0005531888 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 22 01:34:27 np0005531888 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 22 01:34:27 np0005531888 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 22 01:34:27 np0005531888 systemd[1]: Starting Automatic Boot Loader Update...
Nov 22 01:34:27 np0005531888 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 22 01:34:27 np0005531888 systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 01:34:27 np0005531888 bootctl[699]: Couldn't find EFI system partition, skipping.
Nov 22 01:34:27 np0005531888 systemd[1]: Finished Automatic Boot Loader Update.
Nov 22 01:34:27 np0005531888 systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 01:34:27 np0005531888 systemd[1]: Starting Security Auditing Service...
Nov 22 01:34:27 np0005531888 systemd[1]: Starting RPC Bind...
Nov 22 01:34:27 np0005531888 systemd[1]: Starting Rebuild Journal Catalog...
Nov 22 01:34:27 np0005531888 auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 22 01:34:27 np0005531888 auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 22 01:34:27 np0005531888 systemd[1]: Finished Rebuild Journal Catalog.
Nov 22 01:34:27 np0005531888 systemd[1]: Started RPC Bind.
Nov 22 01:34:27 np0005531888 augenrules[710]: /sbin/augenrules: No change
Nov 22 01:34:27 np0005531888 augenrules[725]: No rules
Nov 22 01:34:27 np0005531888 augenrules[725]: enabled 1
Nov 22 01:34:27 np0005531888 augenrules[725]: failure 1
Nov 22 01:34:27 np0005531888 augenrules[725]: pid 705
Nov 22 01:34:27 np0005531888 augenrules[725]: rate_limit 0
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_limit 8192
Nov 22 01:34:27 np0005531888 augenrules[725]: lost 0
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog 3
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_wait_time 60000
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_wait_time_actual 0
Nov 22 01:34:27 np0005531888 augenrules[725]: enabled 1
Nov 22 01:34:27 np0005531888 augenrules[725]: failure 1
Nov 22 01:34:27 np0005531888 augenrules[725]: pid 705
Nov 22 01:34:27 np0005531888 augenrules[725]: rate_limit 0
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_limit 8192
Nov 22 01:34:27 np0005531888 augenrules[725]: lost 0
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog 4
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_wait_time 60000
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_wait_time_actual 0
Nov 22 01:34:27 np0005531888 augenrules[725]: enabled 1
Nov 22 01:34:27 np0005531888 augenrules[725]: failure 1
Nov 22 01:34:27 np0005531888 augenrules[725]: pid 705
Nov 22 01:34:27 np0005531888 augenrules[725]: rate_limit 0
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_limit 8192
Nov 22 01:34:27 np0005531888 augenrules[725]: lost 0
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog 3
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_wait_time 60000
Nov 22 01:34:27 np0005531888 augenrules[725]: backlog_wait_time_actual 0
Nov 22 01:34:27 np0005531888 systemd[1]: Started Security Auditing Service.
Nov 22 01:34:27 np0005531888 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 22 01:34:27 np0005531888 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 22 01:34:28 np0005531888 systemd[1]: Finished Rebuild Hardware Database.
Nov 22 01:34:28 np0005531888 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 01:34:28 np0005531888 systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 01:34:28 np0005531888 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 01:34:28 np0005531888 systemd[1]: Starting Load Kernel Module configfs...
Nov 22 01:34:28 np0005531888 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 22 01:34:28 np0005531888 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 01:34:28 np0005531888 systemd[1]: Finished Load Kernel Module configfs.
Nov 22 01:34:28 np0005531888 systemd-udevd[740]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 01:34:28 np0005531888 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 22 01:34:28 np0005531888 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 22 01:34:28 np0005531888 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 22 01:34:28 np0005531888 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 22 01:34:28 np0005531888 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 22 01:34:28 np0005531888 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 22 01:34:28 np0005531888 kernel: Console: switching to colour dummy device 80x25
Nov 22 01:34:28 np0005531888 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 22 01:34:28 np0005531888 kernel: [drm] features: -context_init
Nov 22 01:34:28 np0005531888 kernel: [drm] number of scanouts: 1
Nov 22 01:34:28 np0005531888 kernel: [drm] number of cap sets: 0
Nov 22 01:34:28 np0005531888 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 22 01:34:28 np0005531888 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 22 01:34:28 np0005531888 kernel: Console: switching to colour frame buffer device 128x48
Nov 22 01:34:28 np0005531888 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 22 01:34:28 np0005531888 kernel: kvm_amd: TSC scaling supported
Nov 22 01:34:28 np0005531888 kernel: kvm_amd: Nested Virtualization enabled
Nov 22 01:34:28 np0005531888 kernel: kvm_amd: Nested Paging enabled
Nov 22 01:34:28 np0005531888 kernel: kvm_amd: LBR virtualization supported
Nov 22 01:34:29 np0005531888 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 22 01:34:29 np0005531888 systemd[1]: Starting Update is Completed...
Nov 22 01:34:29 np0005531888 systemd[1]: Finished Update is Completed.
Nov 22 01:34:29 np0005531888 systemd[1]: Reached target System Initialization.
Nov 22 01:34:29 np0005531888 systemd[1]: Started dnf makecache --timer.
Nov 22 01:34:29 np0005531888 systemd[1]: Started Daily rotation of log files.
Nov 22 01:34:29 np0005531888 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 22 01:34:29 np0005531888 systemd[1]: Reached target Timer Units.
Nov 22 01:34:29 np0005531888 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 22 01:34:29 np0005531888 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 22 01:34:29 np0005531888 systemd[1]: Reached target Socket Units.
Nov 22 01:34:29 np0005531888 systemd[1]: Starting D-Bus System Message Bus...
Nov 22 01:34:29 np0005531888 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 01:34:29 np0005531888 systemd[1]: Started D-Bus System Message Bus.
Nov 22 01:34:29 np0005531888 systemd[1]: Reached target Basic System.
Nov 22 01:34:29 np0005531888 dbus-broker-lau[814]: Ready
Nov 22 01:34:29 np0005531888 systemd[1]: Starting NTP client/server...
Nov 22 01:34:29 np0005531888 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 22 01:34:29 np0005531888 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 22 01:34:29 np0005531888 systemd[1]: Starting IPv4 firewall with iptables...
Nov 22 01:34:29 np0005531888 systemd[1]: Started irqbalance daemon.
Nov 22 01:34:29 np0005531888 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 22 01:34:29 np0005531888 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 01:34:29 np0005531888 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 01:34:29 np0005531888 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 01:34:29 np0005531888 systemd[1]: Reached target sshd-keygen.target.
Nov 22 01:34:29 np0005531888 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 22 01:34:29 np0005531888 systemd[1]: Reached target User and Group Name Lookups.
Nov 22 01:34:29 np0005531888 systemd[1]: Starting User Login Management...
Nov 22 01:34:29 np0005531888 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 22 01:34:29 np0005531888 systemd-logind[825]: New seat seat0.
Nov 22 01:34:29 np0005531888 systemd-logind[825]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 01:34:29 np0005531888 systemd-logind[825]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 01:34:29 np0005531888 systemd[1]: Started User Login Management.
Nov 22 01:34:29 np0005531888 chronyd[833]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 01:34:29 np0005531888 chronyd[833]: Loaded 0 symmetric keys
Nov 22 01:34:29 np0005531888 chronyd[833]: Using right/UTC timezone to obtain leap second data
Nov 22 01:34:29 np0005531888 chronyd[833]: Loaded seccomp filter (level 2)
Nov 22 01:34:29 np0005531888 systemd[1]: Started NTP client/server.
Nov 22 01:34:29 np0005531888 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 22 01:34:29 np0005531888 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 22 01:34:29 np0005531888 iptables.init[819]: iptables: Applying firewall rules: [  OK  ]
Nov 22 01:34:29 np0005531888 systemd[1]: Finished IPv4 firewall with iptables.
Nov 22 01:34:32 np0005531888 cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 22 Nov 2025 06:34:31 +0000. Up 12.67 seconds.
Nov 22 01:34:32 np0005531888 systemd[1]: run-cloud\x2dinit-tmp-tmp6wh0uq9e.mount: Deactivated successfully.
Nov 22 01:34:32 np0005531888 systemd[1]: Starting Hostname Service...
Nov 22 01:34:32 np0005531888 systemd[1]: Started Hostname Service.
Nov 22 01:34:32 np0005531888 systemd-hostnamed[857]: Hostname set to <np0005531888.novalocal> (static)
Nov 22 01:34:32 np0005531888 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 22 01:34:32 np0005531888 systemd[1]: Reached target Preparation for Network.
Nov 22 01:34:32 np0005531888 systemd[1]: Starting Network Manager...
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.0635] NetworkManager (version 1.54.1-1.el9) is starting... (boot:599dca8e-896c-427a-bdfa-1bf204f481cd)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.0642] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.1452] manager[0x55ae26463080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.1592] hostname: hostname: using hostnamed
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.1592] hostname: static hostname changed from (none) to "np0005531888.novalocal"
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.1601] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.1826] manager[0x55ae26463080]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.1827] manager[0x55ae26463080]: rfkill: WWAN hardware radio set enabled
Nov 22 01:34:33 np0005531888 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2197] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2197] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2198] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2199] manager: Networking is enabled by state file
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2200] settings: Loaded settings plugin: keyfile (internal)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2244] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2272] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2302] dhcp: init: Using DHCP client 'internal'
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2305] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2321] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:34:33 np0005531888 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2335] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2343] device (lo): Activation: starting connection 'lo' (94d7c5fa-5249-4bb3-8e91-0e8922804c08)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2354] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2358] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 01:34:33 np0005531888 systemd[1]: Started Network Manager.
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2387] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2391] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2393] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2395] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2397] device (eth0): carrier: link connected
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2400] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 01:34:33 np0005531888 systemd[1]: Reached target Network.
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2405] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2417] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2422] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2422] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2424] manager: NetworkManager state is now CONNECTING
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2426] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2433] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2436] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:34:33 np0005531888 systemd[1]: Starting Network Manager Wait Online...
Nov 22 01:34:33 np0005531888 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 22 01:34:33 np0005531888 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2682] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2686] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 01:34:33 np0005531888 NetworkManager[861]: <info>  [1763793273.2692] device (lo): Activation: successful, device activated.
Nov 22 01:34:33 np0005531888 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 22 01:34:33 np0005531888 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 01:34:33 np0005531888 systemd[1]: Reached target NFS client services.
Nov 22 01:34:33 np0005531888 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 01:34:33 np0005531888 systemd[1]: Reached target Remote File Systems.
Nov 22 01:34:33 np0005531888 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4190] dhcp4 (eth0): state changed new lease, address=38.129.56.229
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4207] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4251] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4285] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4288] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4293] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4299] device (eth0): Activation: successful, device activated.
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4309] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 01:34:34 np0005531888 NetworkManager[861]: <info>  [1763793274.4315] manager: startup complete
Nov 22 01:34:34 np0005531888 systemd[1]: Finished Network Manager Wait Online.
Nov 22 01:34:34 np0005531888 systemd[1]: Starting Cloud-init: Network Stage...
Nov 22 01:34:34 np0005531888 cloud-init[926]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 22 Nov 2025 06:34:34 +0000. Up 15.44 seconds.
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |  eth0  | True |        38.129.56.229         | 255.255.255.0 | global | fa:16:3e:67:75:e5 |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |  eth0  | True | fe80::f816:3eff:fe67:75e5/64 |       .       |  link  | fa:16:3e:67:75:e5 |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 22 01:34:34 np0005531888 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 01:34:38 np0005531888 cloud-init[926]: Generating public/private rsa key pair.
Nov 22 01:34:38 np0005531888 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 22 01:34:38 np0005531888 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 22 01:34:38 np0005531888 cloud-init[926]: The key fingerprint is:
Nov 22 01:34:38 np0005531888 cloud-init[926]: SHA256:vlUwp29CkPI/uboBcUqcoFQrDZuXXSNt40txYNV9q7A root@np0005531888.novalocal
Nov 22 01:34:38 np0005531888 cloud-init[926]: The key's randomart image is:
Nov 22 01:34:38 np0005531888 cloud-init[926]: +---[RSA 3072]----+
Nov 22 01:34:38 np0005531888 cloud-init[926]: |  o.o ..=o.. .   |
Nov 22 01:34:38 np0005531888 cloud-init[926]: | . * * ==o. . . .|
Nov 22 01:34:38 np0005531888 cloud-init[926]: |  = = Bo++o .  ..|
Nov 22 01:34:38 np0005531888 cloud-init[926]: |   o . *o. *   . |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |      o.S.o + .  |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |       o.o E .   |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |        o * o    |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |         + =     |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |        +o.      |
Nov 22 01:34:38 np0005531888 cloud-init[926]: +----[SHA256]-----+
Nov 22 01:34:38 np0005531888 cloud-init[926]: Generating public/private ecdsa key pair.
Nov 22 01:34:38 np0005531888 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 22 01:34:38 np0005531888 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 22 01:34:38 np0005531888 cloud-init[926]: The key fingerprint is:
Nov 22 01:34:38 np0005531888 cloud-init[926]: SHA256:s7y/EQiKlhD8OIeg7V4/hvXLJES8jzFJ3hpBO7Je/oY root@np0005531888.novalocal
Nov 22 01:34:38 np0005531888 cloud-init[926]: The key's randomart image is:
Nov 22 01:34:38 np0005531888 cloud-init[926]: +---[ECDSA 256]---+
Nov 22 01:34:38 np0005531888 cloud-init[926]: |o     .          |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |.o   o .         |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |+.+ . O          |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |.=.= B B .       |
Nov 22 01:34:38 np0005531888 cloud-init[926]: | .* o X S .      |
Nov 22 01:34:38 np0005531888 cloud-init[926]: | ....+.O o .     |
Nov 22 01:34:38 np0005531888 cloud-init[926]: | . ..+=o= .      |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |  . . E=o. .     |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |     . o=oo.     |
Nov 22 01:34:38 np0005531888 cloud-init[926]: +----[SHA256]-----+
Nov 22 01:34:38 np0005531888 cloud-init[926]: Generating public/private ed25519 key pair.
Nov 22 01:34:38 np0005531888 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 22 01:34:38 np0005531888 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 22 01:34:38 np0005531888 cloud-init[926]: The key fingerprint is:
Nov 22 01:34:38 np0005531888 cloud-init[926]: SHA256:JPH3Reqw+F3eDvQWL0NQEHwSjnBm0rsGnq2zXiBhgxY root@np0005531888.novalocal
Nov 22 01:34:38 np0005531888 cloud-init[926]: The key's randomart image is:
Nov 22 01:34:38 np0005531888 cloud-init[926]: +--[ED25519 256]--+
Nov 22 01:34:38 np0005531888 cloud-init[926]: |    E . o.+.++o  |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |     o o *.oo+.  |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |    o = o +.+o.  |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |   . . =.o.= o   |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |      ..S+..o +. |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |       .oo+. = oo|
Nov 22 01:34:38 np0005531888 cloud-init[926]: |         oo . = =|
Nov 22 01:34:38 np0005531888 cloud-init[926]: |        o.     * |
Nov 22 01:34:38 np0005531888 cloud-init[926]: |       .oo      .|
Nov 22 01:34:38 np0005531888 cloud-init[926]: +----[SHA256]-----+
Nov 22 01:34:38 np0005531888 systemd[1]: Finished Cloud-init: Network Stage.
Nov 22 01:34:38 np0005531888 systemd[1]: Reached target Cloud-config availability.
Nov 22 01:34:38 np0005531888 systemd[1]: Reached target Network is Online.
Nov 22 01:34:38 np0005531888 systemd[1]: Starting Cloud-init: Config Stage...
Nov 22 01:34:38 np0005531888 systemd[1]: Starting Crash recovery kernel arming...
Nov 22 01:34:38 np0005531888 systemd[1]: Starting Notify NFS peers of a restart...
Nov 22 01:34:38 np0005531888 systemd[1]: Starting System Logging Service...
Nov 22 01:34:38 np0005531888 systemd[1]: Starting OpenSSH server daemon...
Nov 22 01:34:38 np0005531888 systemd[1]: Starting Permit User Sessions...
Nov 22 01:34:38 np0005531888 systemd[1]: Started OpenSSH server daemon.
Nov 22 01:34:38 np0005531888 sm-notify[1009]: Version 2.5.4 starting
Nov 22 01:34:38 np0005531888 systemd[1]: Started Notify NFS peers of a restart.
Nov 22 01:34:38 np0005531888 systemd[1]: Finished Permit User Sessions.
Nov 22 01:34:38 np0005531888 systemd[1]: Started Command Scheduler.
Nov 22 01:34:38 np0005531888 systemd[1]: Started Getty on tty1.
Nov 22 01:34:38 np0005531888 systemd[1]: Started Serial Getty on ttyS0.
Nov 22 01:34:38 np0005531888 systemd[1]: Reached target Login Prompts.
Nov 22 01:34:39 np0005531888 rsyslogd[1010]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1010" x-info="https://www.rsyslog.com"] start
Nov 22 01:34:39 np0005531888 rsyslogd[1010]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 22 01:34:39 np0005531888 systemd[1]: Started System Logging Service.
Nov 22 01:34:39 np0005531888 systemd[1]: Reached target Multi-User System.
Nov 22 01:34:39 np0005531888 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 22 01:34:39 np0005531888 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 22 01:34:39 np0005531888 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 22 01:34:39 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 01:34:39 np0005531888 cloud-init[1087]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 22 Nov 2025 06:34:39 +0000. Up 19.85 seconds.
Nov 22 01:34:39 np0005531888 kdumpctl[1021]: kdump: No kdump initial ramdisk found.
Nov 22 01:34:39 np0005531888 kdumpctl[1021]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 22 01:34:39 np0005531888 systemd[1]: Finished Cloud-init: Config Stage.
Nov 22 01:34:39 np0005531888 systemd[1]: Starting Cloud-init: Final Stage...
Nov 22 01:34:39 np0005531888 chronyd[833]: Selected source 138.197.135.239 (2.centos.pool.ntp.org)
Nov 22 01:34:39 np0005531888 chronyd[833]: System clock TAI offset set to 37 seconds
Nov 22 01:34:39 np0005531888 cloud-init[1238]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 22 Nov 2025 06:34:39 +0000. Up 20.27 seconds.
Nov 22 01:34:39 np0005531888 cloud-init[1257]: #############################################################
Nov 22 01:34:39 np0005531888 cloud-init[1262]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 22 01:34:39 np0005531888 cloud-init[1268]: 256 SHA256:s7y/EQiKlhD8OIeg7V4/hvXLJES8jzFJ3hpBO7Je/oY root@np0005531888.novalocal (ECDSA)
Nov 22 01:34:39 np0005531888 cloud-init[1274]: 256 SHA256:JPH3Reqw+F3eDvQWL0NQEHwSjnBm0rsGnq2zXiBhgxY root@np0005531888.novalocal (ED25519)
Nov 22 01:34:39 np0005531888 cloud-init[1281]: 3072 SHA256:vlUwp29CkPI/uboBcUqcoFQrDZuXXSNt40txYNV9q7A root@np0005531888.novalocal (RSA)
Nov 22 01:34:39 np0005531888 cloud-init[1282]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 22 01:34:39 np0005531888 cloud-init[1283]: #############################################################
Nov 22 01:34:39 np0005531888 cloud-init[1238]: Cloud-init v. 24.4-7.el9 finished at Sat, 22 Nov 2025 06:34:39 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 20.50 seconds
Nov 22 01:34:39 np0005531888 dracut[1305]: dracut-057-102.git20250818.el9
Nov 22 01:34:39 np0005531888 systemd[1]: Finished Cloud-init: Final Stage.
Nov 22 01:34:39 np0005531888 systemd[1]: Reached target Cloud-init target.
Nov 22 01:34:40 np0005531888 dracut[1307]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 22 01:34:40 np0005531888 irqbalance[820]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 22 01:34:40 np0005531888 irqbalance[820]: IRQ 25 affinity is now unmanaged
Nov 22 01:34:40 np0005531888 irqbalance[820]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 22 01:34:40 np0005531888 irqbalance[820]: IRQ 31 affinity is now unmanaged
Nov 22 01:34:40 np0005531888 irqbalance[820]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 22 01:34:40 np0005531888 irqbalance[820]: IRQ 28 affinity is now unmanaged
Nov 22 01:34:40 np0005531888 irqbalance[820]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 22 01:34:40 np0005531888 irqbalance[820]: IRQ 32 affinity is now unmanaged
Nov 22 01:34:40 np0005531888 irqbalance[820]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 22 01:34:40 np0005531888 irqbalance[820]: IRQ 30 affinity is now unmanaged
Nov 22 01:34:40 np0005531888 irqbalance[820]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 22 01:34:40 np0005531888 irqbalance[820]: IRQ 29 affinity is now unmanaged
Nov 22 01:34:40 np0005531888 dracut[1307]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 22 01:34:40 np0005531888 dracut[1307]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 22 01:34:40 np0005531888 dracut[1307]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: memstrack is not available
Nov 22 01:34:41 np0005531888 dracut[1307]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531888 dracut[1307]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 01:34:42 np0005531888 dracut[1307]: memstrack is not available
Nov 22 01:34:42 np0005531888 dracut[1307]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 01:34:42 np0005531888 dracut[1307]: *** Including module: systemd ***
Nov 22 01:34:42 np0005531888 dracut[1307]: *** Including module: fips ***
Nov 22 01:34:43 np0005531888 dracut[1307]: *** Including module: systemd-initrd ***
Nov 22 01:34:43 np0005531888 dracut[1307]: *** Including module: i18n ***
Nov 22 01:34:43 np0005531888 dracut[1307]: *** Including module: drm ***
Nov 22 01:34:43 np0005531888 dracut[1307]: *** Including module: prefixdevname ***
Nov 22 01:34:43 np0005531888 dracut[1307]: *** Including module: kernel-modules ***
Nov 22 01:34:44 np0005531888 kernel: block vda: the capability attribute has been deprecated.
Nov 22 01:34:44 np0005531888 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:34:44 np0005531888 dracut[1307]: *** Including module: kernel-modules-extra ***
Nov 22 01:34:44 np0005531888 dracut[1307]: *** Including module: qemu ***
Nov 22 01:34:44 np0005531888 dracut[1307]: *** Including module: fstab-sys ***
Nov 22 01:34:44 np0005531888 dracut[1307]: *** Including module: rootfs-block ***
Nov 22 01:34:44 np0005531888 dracut[1307]: *** Including module: terminfo ***
Nov 22 01:34:44 np0005531888 dracut[1307]: *** Including module: udev-rules ***
Nov 22 01:34:45 np0005531888 dracut[1307]: Skipping udev rule: 91-permissions.rules
Nov 22 01:34:45 np0005531888 dracut[1307]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 22 01:34:45 np0005531888 dracut[1307]: *** Including module: virtiofs ***
Nov 22 01:34:45 np0005531888 dracut[1307]: *** Including module: dracut-systemd ***
Nov 22 01:34:45 np0005531888 dracut[1307]: *** Including module: usrmount ***
Nov 22 01:34:45 np0005531888 dracut[1307]: *** Including module: base ***
Nov 22 01:34:45 np0005531888 dracut[1307]: *** Including module: fs-lib ***
Nov 22 01:34:45 np0005531888 dracut[1307]: *** Including module: kdumpbase ***
Nov 22 01:34:46 np0005531888 dracut[1307]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 22 01:34:46 np0005531888 dracut[1307]:  microcode_ctl module: mangling fw_dir
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 22 01:34:46 np0005531888 dracut[1307]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 22 01:34:46 np0005531888 dracut[1307]: *** Including module: openssl ***
Nov 22 01:34:46 np0005531888 dracut[1307]: *** Including module: shutdown ***
Nov 22 01:34:46 np0005531888 dracut[1307]: *** Including module: squash ***
Nov 22 01:34:46 np0005531888 dracut[1307]: *** Including modules done ***
Nov 22 01:34:46 np0005531888 dracut[1307]: *** Installing kernel module dependencies ***
Nov 22 01:34:47 np0005531888 dracut[1307]: *** Installing kernel module dependencies done ***
Nov 22 01:34:47 np0005531888 dracut[1307]: *** Resolving executable dependencies ***
Nov 22 01:34:49 np0005531888 dracut[1307]: *** Resolving executable dependencies done ***
Nov 22 01:34:49 np0005531888 dracut[1307]: *** Generating early-microcode cpio image ***
Nov 22 01:34:49 np0005531888 dracut[1307]: *** Store current command line parameters ***
Nov 22 01:34:49 np0005531888 dracut[1307]: Stored kernel commandline:
Nov 22 01:34:49 np0005531888 dracut[1307]: No dracut internal kernel commandline stored in the initramfs
Nov 22 01:34:50 np0005531888 dracut[1307]: *** Install squash loader ***
Nov 22 01:34:51 np0005531888 dracut[1307]: *** Squashing the files inside the initramfs ***
Nov 22 01:34:52 np0005531888 dracut[1307]: *** Squashing the files inside the initramfs done ***
Nov 22 01:34:52 np0005531888 dracut[1307]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 22 01:34:52 np0005531888 dracut[1307]: *** Hardlinking files ***
Nov 22 01:34:52 np0005531888 dracut[1307]: *** Hardlinking files done ***
Nov 22 01:34:53 np0005531888 dracut[1307]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 22 01:34:53 np0005531888 kdumpctl[1021]: kdump: kexec: loaded kdump kernel
Nov 22 01:34:53 np0005531888 kdumpctl[1021]: kdump: Starting kdump: [OK]
Nov 22 01:34:53 np0005531888 systemd[1]: Finished Crash recovery kernel arming.
Nov 22 01:34:53 np0005531888 systemd[1]: Startup finished in 1.590s (kernel) + 3.871s (initrd) + 29.172s (userspace) = 34.635s.
Nov 22 01:34:55 np0005531888 systemd[1]: Created slice User Slice of UID 1000.
Nov 22 01:34:55 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 22 01:34:55 np0005531888 systemd-logind[825]: New session 1 of user zuul.
Nov 22 01:34:55 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 22 01:34:55 np0005531888 systemd[1]: Starting User Manager for UID 1000...
Nov 22 01:34:56 np0005531888 systemd[4304]: Queued start job for default target Main User Target.
Nov 22 01:34:56 np0005531888 systemd[4304]: Created slice User Application Slice.
Nov 22 01:34:56 np0005531888 systemd[4304]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 01:34:56 np0005531888 systemd[4304]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 01:34:56 np0005531888 systemd[4304]: Reached target Paths.
Nov 22 01:34:56 np0005531888 systemd[4304]: Reached target Timers.
Nov 22 01:34:56 np0005531888 systemd[4304]: Starting D-Bus User Message Bus Socket...
Nov 22 01:34:56 np0005531888 systemd[4304]: Starting Create User's Volatile Files and Directories...
Nov 22 01:34:56 np0005531888 systemd[4304]: Finished Create User's Volatile Files and Directories.
Nov 22 01:34:56 np0005531888 systemd[4304]: Listening on D-Bus User Message Bus Socket.
Nov 22 01:34:56 np0005531888 systemd[4304]: Reached target Sockets.
Nov 22 01:34:56 np0005531888 systemd[4304]: Reached target Basic System.
Nov 22 01:34:56 np0005531888 systemd[4304]: Reached target Main User Target.
Nov 22 01:34:56 np0005531888 systemd[4304]: Startup finished in 164ms.
Nov 22 01:34:56 np0005531888 systemd[1]: Started User Manager for UID 1000.
Nov 22 01:34:56 np0005531888 systemd[1]: Started Session 1 of User zuul.
Nov 22 01:34:56 np0005531888 python3[4387]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:03 np0005531888 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 01:35:10 np0005531888 python3[4418]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:16 np0005531888 python3[4476]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:17 np0005531888 python3[4516]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 22 01:35:19 np0005531888 python3[4542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDIDDDD+fltt9cmDgcjLSkGENwZvQzj5XoQ8wGDcg2s6u+LVhotbjXRoCyQvkLrQ9+aYjFbt1JZ05PeSToOVkPdJ2l6AucsYKMFk7tKlgqYA0SfBQkQjrI4dYCIJp5Zl46tl+HQ7eT2kkERLJRgc1sNhw88jbxU83GEmQNcj9/Q6rj2r+/nIptD66sUseZ1GDb43Ao7zBSzRrD8HRZlEfDChNFod0RykV5phE1R5jhZzJ7KtwI8ovnac3+YT5JW3uK2sdRHHMkZyMiqLqGgsozncX0tlbDqQ6Td89rR3ia15IGC2ZhCwZ5c8vyHhHLG0eEjA73ADlY3cxVKkV8ULfKIWbZL7+AmS7WLvTbD3QSMnkFyuzpAbq/zrs1iZFaLNioOyXiKn0sdTX+CE+goDViTSGJIE8ELsdVZ1adwTqArvAG+Rek7RLJ0oiTWo43Kjdyfs/JYcGpxz+5HVoi4aE2g0M5qhLU7D/EmGa4VwYjui4rxXMlhIFmTsq1NgHSMlB8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:20 np0005531888 python3[4566]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:20 np0005531888 python3[4665]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:20 np0005531888 python3[4736]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793320.2956715-253-249832880561284/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6bc74860ecfa49adaf1e65a536fcfd6f_id_rsa follow=False checksum=d1aad691a5f7d928d36e451e57eecb0570edc5f2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:21 np0005531888 python3[4859]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:22 np0005531888 python3[4930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793321.3191621-308-19285497639019/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6bc74860ecfa49adaf1e65a536fcfd6f_id_rsa.pub follow=False checksum=5c64f06d32705901c18adda8251e89259a484c91 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:23 np0005531888 python3[4978]: ansible-ping Invoked with data=pong
Nov 22 01:35:24 np0005531888 python3[5002]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:26 np0005531888 python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 22 01:35:27 np0005531888 python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:27 np0005531888 python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:28 np0005531888 python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:29 np0005531888 python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:29 np0005531888 python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:29 np0005531888 python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:31 np0005531888 python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:31 np0005531888 python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:32 np0005531888 python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793331.3556905-34-169511528045850/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:33 np0005531888 python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:33 np0005531888 python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:33 np0005531888 python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:33 np0005531888 python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531888 python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531888 python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531888 python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531888 python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531888 python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531888 python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531888 python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531888 python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531888 python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531888 python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531888 python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531888 python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531888 python3[5821]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531888 python3[5845]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531888 python3[5869]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:38 np0005531888 python3[5893]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:38 np0005531888 python3[5917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:38 np0005531888 python3[5941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531888 python3[5965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531888 python3[5989]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531888 python3[6013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531888 python3[6037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:42 np0005531888 python3[6063]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 01:35:42 np0005531888 systemd[1]: Starting Time & Date Service...
Nov 22 01:35:42 np0005531888 systemd[1]: Started Time & Date Service.
Nov 22 01:35:42 np0005531888 systemd-timedated[6065]: Changed time zone to 'UTC' (UTC).
Nov 22 01:35:44 np0005531888 python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:45 np0005531888 python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:45 np0005531888 python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763793345.1219077-254-32507677044152/source _original_basename=tmp29q36v0b follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:46 np0005531888 python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:46 np0005531888 python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763793345.9725387-305-145283022754383/source _original_basename=tmpcxagexxl follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:47 np0005531888 python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:47 np0005531888 python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763793347.0558496-384-128951426373303/source _original_basename=tmpvfv5xe1b follow=False checksum=2ba4345008cfb3d885fcd5bc880f8dce67f5151b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:48 np0005531888 python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:35:48 np0005531888 python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:35:48 np0005531888 python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:49 np0005531888 python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793348.6814065-454-210144349514264/source _original_basename=tmpithrikv5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:50 np0005531888 python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-2f73-06ac-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:35:50 np0005531888 python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-2f73-06ac-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 22 01:35:52 np0005531888 python3[6922]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:36:13 np0005531888 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 01:36:29 np0005531888 python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:37:29 np0005531888 systemd-logind[825]: Session 1 logged out. Waiting for processes to exit.
Nov 22 01:37:50 np0005531888 systemd[4304]: Starting Mark boot as successful...
Nov 22 01:37:50 np0005531888 systemd[4304]: Finished Mark boot as successful.
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 22 01:37:54 np0005531888 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 22 01:37:54 np0005531888 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1022] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 01:37:54 np0005531888 systemd-udevd[6954]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1218] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1256] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1262] device (eth1): carrier: link connected
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1264] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1271] policy: auto-activating connection 'Wired connection 1' (284b0229-109e-3afe-9ffa-d17d71e04c75)
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1276] device (eth1): Activation: starting connection 'Wired connection 1' (284b0229-109e-3afe-9ffa-d17d71e04c75)
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1277] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1282] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1287] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 01:37:54 np0005531888 NetworkManager[861]: <info>  [1763793474.1292] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:37:54 np0005531888 systemd-logind[825]: New session 3 of user zuul.
Nov 22 01:37:54 np0005531888 systemd[1]: Started Session 3 of User zuul.
Nov 22 01:37:54 np0005531888 python3[6985]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-e99d-39a2-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:38:01 np0005531888 python3[7065]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:38:02 np0005531888 python3[7138]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793481.6108832-206-229311530934562/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=5b2b366f624e36745816768759036817e1980dd7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:38:02 np0005531888 python3[7188]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 01:38:02 np0005531888 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 01:38:02 np0005531888 systemd[1]: Stopped Network Manager Wait Online.
Nov 22 01:38:02 np0005531888 systemd[1]: Stopping Network Manager Wait Online...
Nov 22 01:38:02 np0005531888 systemd[1]: Stopping Network Manager...
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8537] caught SIGTERM, shutting down normally.
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8549] dhcp4 (eth0): canceled DHCP transaction
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8550] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8550] dhcp4 (eth0): state changed no lease
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8552] manager: NetworkManager state is now CONNECTING
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8646] dhcp4 (eth1): canceled DHCP transaction
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8647] dhcp4 (eth1): state changed no lease
Nov 22 01:38:02 np0005531888 NetworkManager[861]: <info>  [1763793482.8697] exiting (success)
Nov 22 01:38:02 np0005531888 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:38:02 np0005531888 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 01:38:02 np0005531888 systemd[1]: Stopped Network Manager.
Nov 22 01:38:02 np0005531888 systemd[1]: NetworkManager.service: Consumed 1.816s CPU time, 9.9M memory peak.
Nov 22 01:38:02 np0005531888 systemd[1]: Starting Network Manager...
Nov 22 01:38:02 np0005531888 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:38:02 np0005531888 NetworkManager[7194]: <info>  [1763793482.9211] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:599dca8e-896c-427a-bdfa-1bf204f481cd)
Nov 22 01:38:02 np0005531888 NetworkManager[7194]: <info>  [1763793482.9214] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 01:38:02 np0005531888 NetworkManager[7194]: <info>  [1763793482.9269] manager[0x56060d359070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 01:38:02 np0005531888 systemd[1]: Starting Hostname Service...
Nov 22 01:38:03 np0005531888 systemd[1]: Started Hostname Service.
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0282] hostname: hostname: using hostnamed
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0283] hostname: static hostname changed from (none) to "np0005531888.novalocal"
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0291] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0297] manager[0x56060d359070]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0298] manager[0x56060d359070]: rfkill: WWAN hardware radio set enabled
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0327] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0327] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0328] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0328] manager: Networking is enabled by state file
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0331] settings: Loaded settings plugin: keyfile (internal)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0334] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0360] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0372] dhcp: init: Using DHCP client 'internal'
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0374] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0379] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0384] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0392] device (lo): Activation: starting connection 'lo' (94d7c5fa-5249-4bb3-8e91-0e8922804c08)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0398] device (eth0): carrier: link connected
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0402] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0406] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0407] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0414] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0420] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0427] device (eth1): carrier: link connected
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0430] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0435] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (284b0229-109e-3afe-9ffa-d17d71e04c75) (indicated)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0436] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0440] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0446] device (eth1): Activation: starting connection 'Wired connection 1' (284b0229-109e-3afe-9ffa-d17d71e04c75)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0454] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 01:38:03 np0005531888 systemd[1]: Started Network Manager.
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0458] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0462] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0463] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0466] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0469] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0471] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0473] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0478] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0485] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0487] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0496] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0498] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0541] dhcp4 (eth0): state changed new lease, address=38.129.56.229
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0549] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 01:38:03 np0005531888 systemd[1]: Starting Network Manager Wait Online...
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0603] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0607] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0613] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0621] device (lo): Activation: successful, device activated.
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0691] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0693] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0696] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0700] device (eth0): Activation: successful, device activated.
Nov 22 01:38:03 np0005531888 NetworkManager[7194]: <info>  [1763793483.0704] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 01:38:03 np0005531888 python3[7272]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-e99d-39a2-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:38:13 np0005531888 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:38:33 np0005531888 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3008] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 01:38:48 np0005531888 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:38:48 np0005531888 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3379] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3384] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3396] device (eth1): Activation: successful, device activated.
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3406] manager: startup complete
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3408] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <warn>  [1763793528.3417] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3428] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 systemd[1]: Finished Network Manager Wait Online.
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3566] dhcp4 (eth1): canceled DHCP transaction
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3566] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3567] dhcp4 (eth1): state changed no lease
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3580] policy: auto-activating connection 'ci-private-network' (118125b3-7f6e-507c-b3e2-eabc6d5622d9)
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3584] device (eth1): Activation: starting connection 'ci-private-network' (118125b3-7f6e-507c-b3e2-eabc6d5622d9)
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3584] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3586] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3592] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3600] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3793] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3795] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 01:38:48 np0005531888 NetworkManager[7194]: <info>  [1763793528.3804] device (eth1): Activation: successful, device activated.
Nov 22 01:38:58 np0005531888 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:39:03 np0005531888 systemd[1]: session-3.scope: Deactivated successfully.
Nov 22 01:39:03 np0005531888 systemd[1]: session-3.scope: Consumed 1.631s CPU time.
Nov 22 01:39:03 np0005531888 systemd-logind[825]: Session 3 logged out. Waiting for processes to exit.
Nov 22 01:39:03 np0005531888 systemd-logind[825]: Removed session 3.
Nov 22 01:39:18 np0005531888 systemd-logind[825]: New session 4 of user zuul.
Nov 22 01:39:18 np0005531888 systemd[1]: Started Session 4 of User zuul.
Nov 22 01:39:19 np0005531888 python3[7382]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:39:19 np0005531888 python3[7455]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793558.937094-365-62519209424640/source _original_basename=tmpcd6um_2v follow=False checksum=ec5da2e3f9737eb58d2ca927fe651700c5f6760b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:39:21 np0005531888 systemd[1]: session-4.scope: Deactivated successfully.
Nov 22 01:39:21 np0005531888 systemd-logind[825]: Session 4 logged out. Waiting for processes to exit.
Nov 22 01:39:21 np0005531888 systemd-logind[825]: Removed session 4.
Nov 22 01:40:50 np0005531888 systemd[4304]: Created slice User Background Tasks Slice.
Nov 22 01:40:50 np0005531888 systemd[4304]: Starting Cleanup of User's Temporary Files and Directories...
Nov 22 01:40:50 np0005531888 systemd[4304]: Finished Cleanup of User's Temporary Files and Directories.
Nov 22 01:45:32 np0005531888 systemd-logind[825]: New session 5 of user zuul.
Nov 22 01:45:32 np0005531888 systemd[1]: Started Session 5 of User zuul.
Nov 22 01:45:32 np0005531888 python3[7537]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-eb85-1f76-000000000ca6-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:32 np0005531888 python3[7565]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531888 python3[7592]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531888 python3[7618]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531888 python3[7644]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531888 python3[7670]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:34 np0005531888 python3[7748]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:45:34 np0005531888 python3[7821]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793934.138532-370-25706399926070/source _original_basename=tmpfk0nzzyr follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:35 np0005531888 python3[7871]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 01:45:35 np0005531888 systemd[1]: Reloading.
Nov 22 01:45:35 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 01:45:37 np0005531888 python3[7927]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 22 01:45:37 np0005531888 python3[7953]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:38 np0005531888 python3[7981]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:38 np0005531888 python3[8009]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:38 np0005531888 python3[8037]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:39 np0005531888 python3[8064]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-eb85-1f76-000000000cad-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:39 np0005531888 python3[8094]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 22 01:45:42 np0005531888 systemd[1]: session-5.scope: Deactivated successfully.
Nov 22 01:45:42 np0005531888 systemd[1]: session-5.scope: Consumed 4.497s CPU time.
Nov 22 01:45:42 np0005531888 systemd-logind[825]: Session 5 logged out. Waiting for processes to exit.
Nov 22 01:45:42 np0005531888 systemd-logind[825]: Removed session 5.
Nov 22 01:45:44 np0005531888 systemd-logind[825]: New session 6 of user zuul.
Nov 22 01:45:44 np0005531888 systemd[1]: Started Session 6 of User zuul.
Nov 22 01:45:44 np0005531888 python3[8128]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 22 01:45:50 np0005531888 irqbalance[820]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 22 01:45:50 np0005531888 irqbalance[820]: IRQ 27 affinity is now unmanaged
Nov 22 01:46:01 np0005531888 kernel: SELinux:  Converting 385 SID table entries...
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:01 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:46:10 np0005531888 kernel: SELinux:  Converting 385 SID table entries...
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:10 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:46:19 np0005531888 kernel: SELinux:  Converting 385 SID table entries...
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:19 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:46:20 np0005531888 setsebool[8194]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 22 01:46:20 np0005531888 setsebool[8194]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 22 01:46:45 np0005531888 kernel: SELinux:  Converting 388 SID table entries...
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:45 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:47:08 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 01:47:08 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 01:47:08 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 01:47:08 np0005531888 systemd[1]: Reloading.
Nov 22 01:47:08 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 01:47:08 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 01:47:54 np0005531888 python3[26437]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d633-179b-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:47:55 np0005531888 kernel: evm: overlay not supported
Nov 22 01:47:56 np0005531888 systemd[4304]: Starting D-Bus User Message Bus...
Nov 22 01:47:56 np0005531888 dbus-broker-launch[26887]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 22 01:47:56 np0005531888 dbus-broker-launch[26887]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 22 01:47:56 np0005531888 systemd[4304]: Started D-Bus User Message Bus.
Nov 22 01:47:56 np0005531888 dbus-broker-lau[26887]: Ready
Nov 22 01:47:56 np0005531888 systemd[4304]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 01:47:56 np0005531888 systemd[4304]: Created slice Slice /user.
Nov 22 01:47:56 np0005531888 systemd[4304]: podman-26733.scope: unit configures an IP firewall, but not running as root.
Nov 22 01:47:56 np0005531888 systemd[4304]: (This warning is only shown for the first unit using IP firewalling.)
Nov 22 01:47:56 np0005531888 systemd[4304]: Started podman-26733.scope.
Nov 22 01:47:56 np0005531888 systemd[4304]: Started podman-pause-9f44b5b6.scope.
Nov 22 01:47:58 np0005531888 python3[27640]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.155:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.155:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:47:58 np0005531888 python3[27640]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 22 01:47:59 np0005531888 systemd[1]: session-6.scope: Deactivated successfully.
Nov 22 01:47:59 np0005531888 systemd[1]: session-6.scope: Consumed 1min 1.111s CPU time.
Nov 22 01:47:59 np0005531888 systemd-logind[825]: Session 6 logged out. Waiting for processes to exit.
Nov 22 01:47:59 np0005531888 systemd-logind[825]: Removed session 6.
Nov 22 01:48:06 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 01:48:06 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 01:48:06 np0005531888 systemd[1]: man-db-cache-update.service: Consumed 59.484s CPU time.
Nov 22 01:48:06 np0005531888 systemd[1]: run-racea23864e054e079e10eb5ee5725dd8.service: Deactivated successfully.
Nov 22 01:48:31 np0005531888 systemd-logind[825]: New session 7 of user zuul.
Nov 22 01:48:31 np0005531888 systemd[1]: Started Session 7 of User zuul.
Nov 22 01:48:31 np0005531888 python3[29645]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:48:32 np0005531888 python3[29671]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:48:33 np0005531888 python3[29697]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005531888.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 22 01:48:34 np0005531888 python3[29731]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:48:34 np0005531888 python3[29809]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:48:35 np0005531888 python3[29882]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763794114.2036984-170-5140415665270/source _original_basename=tmp38o8_rit follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:48:36 np0005531888 python3[29932]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Nov 22 01:48:36 np0005531888 systemd[1]: Starting Hostname Service...
Nov 22 01:48:36 np0005531888 systemd[1]: Started Hostname Service.
Nov 22 01:48:36 np0005531888 systemd-hostnamed[29936]: Changed pretty hostname to 'compute-2'
Nov 22 01:48:36 np0005531888 systemd-hostnamed[29936]: Hostname set to <compute-2> (static)
Nov 22 01:48:36 np0005531888 NetworkManager[7194]: <info>  [1763794116.1825] hostname: static hostname changed from "np0005531888.novalocal" to "compute-2"
Nov 22 01:48:36 np0005531888 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:48:36 np0005531888 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:48:36 np0005531888 systemd[1]: session-7.scope: Deactivated successfully.
Nov 22 01:48:36 np0005531888 systemd[1]: session-7.scope: Consumed 2.590s CPU time.
Nov 22 01:48:36 np0005531888 systemd-logind[825]: Session 7 logged out. Waiting for processes to exit.
Nov 22 01:48:36 np0005531888 systemd-logind[825]: Removed session 7.
Nov 22 01:48:46 np0005531888 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:49:06 np0005531888 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 01:49:50 np0005531888 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 22 01:49:50 np0005531888 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 22 01:49:50 np0005531888 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 22 01:49:50 np0005531888 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 22 01:55:33 np0005531888 systemd-logind[825]: New session 8 of user zuul.
Nov 22 01:55:33 np0005531888 systemd[1]: Started Session 8 of User zuul.
Nov 22 01:55:34 np0005531888 python3[30038]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:55:36 np0005531888 python3[30154]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:36 np0005531888 python3[30227]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:37 np0005531888 python3[30253]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:37 np0005531888 python3[30326]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:37 np0005531888 python3[30352]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:37 np0005531888 python3[30425]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:38 np0005531888 python3[30451]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:38 np0005531888 python3[30524]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:38 np0005531888 python3[30550]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:39 np0005531888 python3[30623]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:39 np0005531888 python3[30649]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:39 np0005531888 python3[30722]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:40 np0005531888 python3[30748]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:40 np0005531888 python3[30821]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.9718688-33955-133802586169289/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:49 np0005531888 python3[30869]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:00:30 np0005531888 systemd[1]: Starting dnf makecache...
Nov 22 02:00:30 np0005531888 dnf[30873]: Failed determining last makecache time.
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-openstack-barbican-42b4c41831408a8e323 234 kB/s |  13 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 1.4 MB/s |  65 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-openstack-cinder-1c00d6490d88e436f26ef 193 kB/s |  32 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-python-stevedore-c4acc5639fd2329372142 1.0 MB/s | 131 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-python-observabilityclient-2f31846d73c 988 kB/s |  25 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-os-net-config-bbae2ed8a159b0435a473f38  11 MB/s | 356 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.3 MB/s |  42 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-python-designate-tests-tempest-347fdbc 762 kB/s |  18 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-openstack-glance-1fd12c29b339f30fe823e 561 kB/s |  18 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 232 kB/s |  29 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-openstack-manila-3c01b7181572c95dac462 647 kB/s |  25 kB     00:00
Nov 22 02:00:31 np0005531888 dnf[30873]: delorean-python-whitebox-neutron-tests-tempest- 3.5 MB/s | 154 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-openstack-octavia-ba397f07a7331190208c 507 kB/s |  26 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-openstack-watcher-c014f81a8647287f6dcc 620 kB/s |  16 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-python-tcib-1124124ec06aadbac34f0d340b  26 kB/s | 7.4 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 550 kB/s | 144 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-openstack-swift-dc98a8463506ac520c469a 137 kB/s |  14 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-python-tempestconf-8515371b7cceebd4282 1.4 MB/s |  53 kB     00:00
Nov 22 02:00:32 np0005531888 dnf[30873]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.0 MB/s |  96 kB     00:00
Nov 22 02:00:33 np0005531888 dnf[30873]: CentOS Stream 9 - BaseOS                         28 kB/s | 7.3 kB     00:00
Nov 22 02:00:33 np0005531888 dnf[30873]: CentOS Stream 9 - AppStream                      76 kB/s | 7.4 kB     00:00
Nov 22 02:00:33 np0005531888 dnf[30873]: CentOS Stream 9 - CRB                            27 kB/s | 7.2 kB     00:00
Nov 22 02:00:34 np0005531888 dnf[30873]: CentOS Stream 9 - Extras packages                27 kB/s | 8.3 kB     00:00
Nov 22 02:00:34 np0005531888 dnf[30873]: dlrn-antelope-testing                           3.2 MB/s | 1.1 MB     00:00
Nov 22 02:00:34 np0005531888 dnf[30873]: dlrn-antelope-build-deps                        5.1 MB/s | 461 kB     00:00
Nov 22 02:00:35 np0005531888 dnf[30873]: centos9-rabbitmq                                3.7 MB/s | 123 kB     00:00
Nov 22 02:00:35 np0005531888 dnf[30873]: centos9-storage                                  20 MB/s | 415 kB     00:00
Nov 22 02:00:35 np0005531888 dnf[30873]: centos9-opstools                                2.2 MB/s |  51 kB     00:00
Nov 22 02:00:35 np0005531888 dnf[30873]: NFV SIG OpenvSwitch                              17 MB/s | 454 kB     00:00
Nov 22 02:00:36 np0005531888 dnf[30873]: repo-setup-centos-appstream                      72 MB/s |  25 MB     00:00
Nov 22 02:00:41 np0005531888 dnf[30873]: repo-setup-centos-baseos                         57 MB/s | 8.8 MB     00:00
Nov 22 02:00:43 np0005531888 dnf[30873]: repo-setup-centos-highavailability              8.8 MB/s | 744 kB     00:00
Nov 22 02:00:44 np0005531888 dnf[30873]: repo-setup-centos-powertools                     20 MB/s | 7.3 MB     00:00
Nov 22 02:00:47 np0005531888 dnf[30873]: Extra Packages for Enterprise Linux 9 - x86_64   14 MB/s |  20 MB     00:01
Nov 22 02:00:49 np0005531888 systemd[1]: session-8.scope: Deactivated successfully.
Nov 22 02:00:49 np0005531888 systemd[1]: session-8.scope: Consumed 4.994s CPU time.
Nov 22 02:00:49 np0005531888 systemd-logind[825]: Session 8 logged out. Waiting for processes to exit.
Nov 22 02:00:49 np0005531888 systemd-logind[825]: Removed session 8.
Nov 22 02:00:59 np0005531888 dnf[30873]: Metadata cache created.
Nov 22 02:00:59 np0005531888 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 02:00:59 np0005531888 systemd[1]: Finished dnf makecache.
Nov 22 02:00:59 np0005531888 systemd[1]: dnf-makecache.service: Consumed 24.152s CPU time.
Nov 22 02:11:49 np0005531888 systemd-logind[825]: New session 9 of user zuul.
Nov 22 02:11:49 np0005531888 systemd[1]: Started Session 9 of User zuul.
Nov 22 02:11:50 np0005531888 python3.9[31150]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:11:52 np0005531888 python3.9[31331]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:12:00 np0005531888 systemd[1]: session-9.scope: Deactivated successfully.
Nov 22 02:12:00 np0005531888 systemd[1]: session-9.scope: Consumed 7.927s CPU time.
Nov 22 02:12:00 np0005531888 systemd-logind[825]: Session 9 logged out. Waiting for processes to exit.
Nov 22 02:12:00 np0005531888 systemd-logind[825]: Removed session 9.
Nov 22 02:12:16 np0005531888 systemd-logind[825]: New session 10 of user zuul.
Nov 22 02:12:16 np0005531888 systemd[1]: Started Session 10 of User zuul.
Nov 22 02:12:16 np0005531888 python3.9[31543]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 22 02:12:18 np0005531888 python3.9[31717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:19 np0005531888 python3.9[31869]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:12:20 np0005531888 python3.9[32022]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:12:21 np0005531888 python3.9[32174]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:12:21 np0005531888 python3.9[32326]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:12:22 np0005531888 python3.9[32449]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795541.3037949-184-60591565761343/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:12:23 np0005531888 python3.9[32601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:24 np0005531888 python3.9[32757]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:12:25 np0005531888 python3.9[32909]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:12:26 np0005531888 python3.9[33059]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:12:30 np0005531888 irqbalance[820]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 22 02:12:30 np0005531888 irqbalance[820]: IRQ 26 affinity is now unmanaged
Nov 22 02:12:30 np0005531888 python3.9[33313]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:12:31 np0005531888 python3.9[33463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:32 np0005531888 python3.9[33617]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:33 np0005531888 python3.9[33775]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:12:34 np0005531888 python3.9[33859]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:13:19 np0005531888 systemd[1]: Reloading.
Nov 22 02:13:19 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:13:19 np0005531888 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 22 02:13:20 np0005531888 systemd[1]: Reloading.
Nov 22 02:13:20 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:13:20 np0005531888 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 22 02:13:20 np0005531888 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 22 02:13:20 np0005531888 systemd[1]: Reloading.
Nov 22 02:13:20 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:13:20 np0005531888 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 22 02:13:20 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:13:20 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:13:20 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:14:30 np0005531888 kernel: SELinux:  Converting 2720 SID table entries...
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:14:30 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:14:30 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 22 02:14:30 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:14:30 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:14:30 np0005531888 systemd[1]: Reloading.
Nov 22 02:14:30 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:14:31 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:14:32 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:14:32 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:14:32 np0005531888 systemd[1]: man-db-cache-update.service: Consumed 1.236s CPU time.
Nov 22 02:14:32 np0005531888 systemd[1]: run-rc8fa2fa3ce7e4401a21c489e27c72c5d.service: Deactivated successfully.
Nov 22 02:14:42 np0005531888 python3.9[35371]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:14:44 np0005531888 python3.9[35652]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 22 02:14:45 np0005531888 python3.9[35804]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 22 02:14:52 np0005531888 python3.9[35959]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:15:09 np0005531888 python3.9[36114]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 22 02:15:11 np0005531888 python3.9[36266]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:12 np0005531888 python3.9[36418]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:15:12 np0005531888 python3.9[36541]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763795711.6385233-674-72166489014773/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:15:14 np0005531888 python3.9[36693]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:15:15 np0005531888 python3.9[36845]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:15 np0005531888 python3.9[36998]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:15:17 np0005531888 python3.9[37150]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 22 02:15:17 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:15:17 np0005531888 python3.9[37304]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:15:19 np0005531888 python3.9[37462]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:15:20 np0005531888 python3.9[37622]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 22 02:15:20 np0005531888 python3.9[37775]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:15:22 np0005531888 python3.9[37933]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 22 02:15:23 np0005531888 python3.9[38085]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:15:26 np0005531888 python3.9[38238]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:27 np0005531888 python3.9[38390]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:15:27 np0005531888 python3.9[38513]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795726.6100872-1031-272885440808833/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:28 np0005531888 python3.9[38665]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:15:28 np0005531888 systemd[1]: Starting Load Kernel Modules...
Nov 22 02:15:28 np0005531888 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 22 02:15:28 np0005531888 kernel: Bridge firewalling registered
Nov 22 02:15:28 np0005531888 systemd-modules-load[38669]: Inserted module 'br_netfilter'
Nov 22 02:15:28 np0005531888 systemd[1]: Finished Load Kernel Modules.
Nov 22 02:15:29 np0005531888 python3.9[38825]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:15:30 np0005531888 python3.9[38948]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795729.214475-1100-205752328438163/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:31 np0005531888 python3.9[39100]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:15:35 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:15:35 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:15:36 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:15:36 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:15:36 np0005531888 systemd[1]: Reloading.
Nov 22 02:15:36 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:15:36 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:15:38 np0005531888 python3.9[41072]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:15:39 np0005531888 python3.9[42047]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 22 02:15:39 np0005531888 python3.9[42934]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:15:40 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:15:40 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:15:40 np0005531888 systemd[1]: man-db-cache-update.service: Consumed 4.554s CPU time.
Nov 22 02:15:40 np0005531888 systemd[1]: run-r272e5e2a9b49497ba866dbf9f21e2d2c.service: Deactivated successfully.
Nov 22 02:15:40 np0005531888 python3.9[43294]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:40 np0005531888 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 02:15:41 np0005531888 systemd[1]: Starting Authorization Manager...
Nov 22 02:15:41 np0005531888 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 02:15:41 np0005531888 polkitd[43511]: Started polkitd version 0.117
Nov 22 02:15:41 np0005531888 systemd[1]: Started Authorization Manager.
Nov 22 02:15:42 np0005531888 python3.9[43681]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:15:42 np0005531888 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 22 02:15:42 np0005531888 systemd[1]: tuned.service: Deactivated successfully.
Nov 22 02:15:42 np0005531888 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 22 02:15:42 np0005531888 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 02:15:42 np0005531888 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 02:15:43 np0005531888 python3.9[43843]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 22 02:15:47 np0005531888 python3.9[43995]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:15:48 np0005531888 systemd[1]: Reloading.
Nov 22 02:15:48 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:15:49 np0005531888 python3.9[44184]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:15:49 np0005531888 systemd[1]: Reloading.
Nov 22 02:15:49 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:15:50 np0005531888 python3.9[44374]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:51 np0005531888 python3.9[44527]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:51 np0005531888 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 22 02:15:51 np0005531888 python3.9[44680]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:54 np0005531888 python3.9[44842]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:54 np0005531888 python3.9[44995]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:15:54 np0005531888 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 02:15:54 np0005531888 systemd[1]: Stopped Apply Kernel Variables.
Nov 22 02:15:54 np0005531888 systemd[1]: Stopping Apply Kernel Variables...
Nov 22 02:15:55 np0005531888 systemd[1]: Starting Apply Kernel Variables...
Nov 22 02:15:55 np0005531888 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 02:15:55 np0005531888 systemd[1]: Finished Apply Kernel Variables.
Nov 22 02:15:55 np0005531888 systemd[1]: session-10.scope: Deactivated successfully.
Nov 22 02:15:55 np0005531888 systemd[1]: session-10.scope: Consumed 2min 11.976s CPU time.
Nov 22 02:15:55 np0005531888 systemd-logind[825]: Session 10 logged out. Waiting for processes to exit.
Nov 22 02:15:55 np0005531888 systemd-logind[825]: Removed session 10.
Nov 22 02:16:01 np0005531888 systemd-logind[825]: New session 11 of user zuul.
Nov 22 02:16:01 np0005531888 systemd[1]: Started Session 11 of User zuul.
Nov 22 02:16:02 np0005531888 python3.9[45178]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:03 np0005531888 python3.9[45332]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:05 np0005531888 python3.9[45488]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:16:06 np0005531888 python3.9[45639]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:07 np0005531888 python3.9[45795]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:16:08 np0005531888 python3.9[45879]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:16:10 np0005531888 python3.9[46032]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:16:11 np0005531888 python3.9[46203]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:16:12 np0005531888 python3.9[46355]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:16:12 np0005531888 systemd[1]: var-lib-containers-storage-overlay-compat10817643-merged.mount: Deactivated successfully.
Nov 22 02:16:12 np0005531888 podman[46356]: 2025-11-22 07:16:12.671499917 +0000 UTC m=+0.178524889 system refresh
Nov 22 02:16:13 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:16:13 np0005531888 python3.9[46519]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:16:14 np0005531888 python3.9[46642]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763795772.921559-294-139996561316090/.source.json follow=False _original_basename=podman_network_config.j2 checksum=fce95ed37f392cc6747a783c3d82ac3f77c8376f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:16:15 np0005531888 python3.9[46794]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:16:15 np0005531888 python3.9[46917]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795774.7594068-339-26851109785897/.source.conf follow=False _original_basename=registries.conf.j2 checksum=193e1b13ee9dd51d1fc7c456c46399ca66d3b9c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:16 np0005531888 python3.9[47069]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:17 np0005531888 python3.9[47221]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:18 np0005531888 python3.9[47373]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:18 np0005531888 python3.9[47525]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:20 np0005531888 python3.9[47675]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:20 np0005531888 python3.9[47829]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:22 np0005531888 python3.9[47982]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:26 np0005531888 python3.9[48142]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:29 np0005531888 python3.9[48295]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:32 np0005531888 python3.9[48448]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:36 np0005531888 python3.9[48604]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:44 np0005531888 python3.9[48774]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:47 np0005531888 python3.9[48927]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:17:10 np0005531888 python3.9[49263]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:17:12 np0005531888 python3.9[49419]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:17:13 np0005531888 python3.9[49594]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:17:13 np0005531888 python3.9[49717]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763795832.892049-783-47143098841317/.source.json _original_basename=.4og95l7j follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:17:15 np0005531888 python3.9[49869]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:15 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:19 np0005531888 systemd[1]: var-lib-containers-storage-overlay-compat1773130549-lower\x2dmapped.mount: Deactivated successfully.
Nov 22 02:17:23 np0005531888 podman[49880]: 2025-11-22 07:17:23.290690038 +0000 UTC m=+8.187426581 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 02:17:23 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:23 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:23 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:24 np0005531888 python3.9[50177]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:28 np0005531888 podman[50189]: 2025-11-22 07:17:28.242225553 +0000 UTC m=+3.520385121 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 02:17:28 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:28 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:28 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:29 np0005531888 python3.9[50428]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:29 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:57 np0005531888 podman[50441]: 2025-11-22 07:17:57.812088477 +0000 UTC m=+28.300564795 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 02:17:57 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:57 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:57 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:58 np0005531888 python3.9[50738]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:58 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:05 np0005531888 podman[50751]: 2025-11-22 07:18:05.01304463 +0000 UTC m=+6.026195356 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 02:18:05 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:05 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:05 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:05 np0005531888 python3.9[51005]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:18:07 np0005531888 podman[51017]: 2025-11-22 07:18:07.812265151 +0000 UTC m=+1.933852510 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 02:18:07 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:07 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:07 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:18 np0005531888 systemd[1]: session-11.scope: Deactivated successfully.
Nov 22 02:18:18 np0005531888 systemd[1]: session-11.scope: Consumed 1min 35.733s CPU time.
Nov 22 02:18:18 np0005531888 systemd-logind[825]: Session 11 logged out. Waiting for processes to exit.
Nov 22 02:18:18 np0005531888 systemd-logind[825]: Removed session 11.
Nov 22 02:18:24 np0005531888 systemd-logind[825]: New session 12 of user zuul.
Nov 22 02:18:24 np0005531888 systemd[1]: Started Session 12 of User zuul.
Nov 22 02:18:28 np0005531888 python3.9[51313]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:18:30 np0005531888 python3.9[51469]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 22 02:18:31 np0005531888 python3.9[51622]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:18:32 np0005531888 python3.9[51780]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:18:33 np0005531888 python3.9[51940]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:18:34 np0005531888 python3.9[52024]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:18:38 np0005531888 python3.9[52186]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:18:53 np0005531888 kernel: SELinux:  Converting 2733 SID table entries...
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:18:53 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:18:54 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 22 02:18:54 np0005531888 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 22 02:18:55 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:18:55 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:18:56 np0005531888 systemd[1]: Reloading.
Nov 22 02:18:56 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:18:56 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:18:56 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:18:56 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:18:56 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:18:56 np0005531888 systemd[1]: run-ra5f048ad0e33412e97fc1eef41ec9517.service: Deactivated successfully.
Nov 22 02:18:58 np0005531888 python3.9[53284]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:18:58 np0005531888 systemd[1]: Reloading.
Nov 22 02:18:58 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:18:58 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:18:58 np0005531888 systemd[1]: Starting Open vSwitch Database Unit...
Nov 22 02:18:58 np0005531888 chown[53326]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 22 02:18:58 np0005531888 ovs-ctl[53331]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 22 02:18:58 np0005531888 ovs-ctl[53331]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 22 02:18:58 np0005531888 ovs-ctl[53331]: Starting ovsdb-server [  OK  ]
Nov 22 02:18:58 np0005531888 ovs-vsctl[53380]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 22 02:18:58 np0005531888 ovs-vsctl[53400]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"4984e16e-8f1c-4426-bfc6-5927f375ce79\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 22 02:18:58 np0005531888 ovs-ctl[53331]: Configuring Open vSwitch system IDs [  OK  ]
Nov 22 02:18:58 np0005531888 ovs-vsctl[53406]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 22 02:18:58 np0005531888 ovs-ctl[53331]: Enabling remote OVSDB managers [  OK  ]
Nov 22 02:18:58 np0005531888 systemd[1]: Started Open vSwitch Database Unit.
Nov 22 02:18:58 np0005531888 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 22 02:18:58 np0005531888 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 22 02:18:58 np0005531888 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 22 02:18:58 np0005531888 kernel: openvswitch: Open vSwitch switching datapath
Nov 22 02:18:58 np0005531888 ovs-ctl[53451]: Inserting openvswitch module [  OK  ]
Nov 22 02:18:59 np0005531888 ovs-ctl[53420]: Starting ovs-vswitchd [  OK  ]
Nov 22 02:18:59 np0005531888 ovs-ctl[53420]: Enabling remote OVSDB managers [  OK  ]
Nov 22 02:18:59 np0005531888 ovs-vsctl[53472]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 22 02:18:59 np0005531888 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 22 02:18:59 np0005531888 systemd[1]: Starting Open vSwitch...
Nov 22 02:18:59 np0005531888 systemd[1]: Finished Open vSwitch.
Nov 22 02:19:00 np0005531888 python3.9[53623]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:19:01 np0005531888 python3.9[53775]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 22 02:19:04 np0005531888 kernel: SELinux:  Converting 2747 SID table entries...
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:19:04 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:19:06 np0005531888 python3.9[53930]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:19:06 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 22 02:19:07 np0005531888 python3.9[54088]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:19:09 np0005531888 python3.9[54241]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:19:11 np0005531888 python3.9[54528]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 02:19:11 np0005531888 python3.9[54678]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:19:12 np0005531888 python3.9[54832]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:19:14 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:19:14 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:19:14 np0005531888 systemd[1]: Reloading.
Nov 22 02:19:14 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:19:14 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:19:14 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:19:15 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:19:15 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:19:15 np0005531888 systemd[1]: run-r125ed3a30b66434bb0547b42bb5af217.service: Deactivated successfully.
Nov 22 02:19:16 np0005531888 python3.9[55151]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:19:16 np0005531888 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 02:19:16 np0005531888 systemd[1]: Stopped Network Manager Wait Online.
Nov 22 02:19:16 np0005531888 systemd[1]: Stopping Network Manager Wait Online...
Nov 22 02:19:16 np0005531888 systemd[1]: Stopping Network Manager...
Nov 22 02:19:16 np0005531888 NetworkManager[7194]: <info>  [1763795956.7499] caught SIGTERM, shutting down normally.
Nov 22 02:19:16 np0005531888 NetworkManager[7194]: <info>  [1763795956.7514] dhcp4 (eth0): canceled DHCP transaction
Nov 22 02:19:16 np0005531888 NetworkManager[7194]: <info>  [1763795956.7514] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 02:19:16 np0005531888 NetworkManager[7194]: <info>  [1763795956.7514] dhcp4 (eth0): state changed no lease
Nov 22 02:19:16 np0005531888 NetworkManager[7194]: <info>  [1763795956.7517] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 02:19:16 np0005531888 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 02:19:16 np0005531888 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 02:19:16 np0005531888 NetworkManager[7194]: <info>  [1763795956.7845] exiting (success)
Nov 22 02:19:16 np0005531888 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 02:19:16 np0005531888 systemd[1]: Stopped Network Manager.
Nov 22 02:19:16 np0005531888 systemd[1]: NetworkManager.service: Consumed 19.370s CPU time, 4.1M memory peak, read 0B from disk, written 31.0K to disk.
Nov 22 02:19:16 np0005531888 systemd[1]: Starting Network Manager...
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.8564] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:599dca8e-896c-427a-bdfa-1bf204f481cd)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.8566] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.8639] manager[0x5623d3864090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 02:19:16 np0005531888 systemd[1]: Starting Hostname Service...
Nov 22 02:19:16 np0005531888 systemd[1]: Started Hostname Service.
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9411] hostname: hostname: using hostnamed
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9412] hostname: static hostname changed from (none) to "compute-2"
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9417] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9420] manager[0x5623d3864090]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9420] manager[0x5623d3864090]: rfkill: WWAN hardware radio set enabled
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9441] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9449] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9449] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9450] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9450] manager: Networking is enabled by state file
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9451] settings: Loaded settings plugin: keyfile (internal)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9454] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9478] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9484] dhcp: init: Using DHCP client 'internal'
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9487] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9490] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9494] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9501] device (lo): Activation: starting connection 'lo' (94d7c5fa-5249-4bb3-8e91-0e8922804c08)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9505] device (eth0): carrier: link connected
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9508] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9511] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9512] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9516] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9520] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9524] device (eth1): carrier: link connected
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9527] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9531] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (118125b3-7f6e-507c-b3e2-eabc6d5622d9) (indicated)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9532] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9537] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9542] device (eth1): Activation: starting connection 'ci-private-network' (118125b3-7f6e-507c-b3e2-eabc6d5622d9)
Nov 22 02:19:16 np0005531888 systemd[1]: Started Network Manager.
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9548] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9555] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9557] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9559] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9560] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9563] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9565] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9576] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9581] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9587] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9589] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9597] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9608] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9620] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9622] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9628] device (lo): Activation: successful, device activated.
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9634] dhcp4 (eth0): state changed new lease, address=38.129.56.229
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9639] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 02:19:16 np0005531888 systemd[1]: Starting Network Manager Wait Online...
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9730] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9737] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9738] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9741] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9744] device (eth1): Activation: successful, device activated.
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9804] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9808] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9812] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9819] device (eth0): Activation: successful, device activated.
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9825] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 02:19:16 np0005531888 NetworkManager[55166]: <info>  [1763795956.9829] manager: startup complete
Nov 22 02:19:16 np0005531888 systemd[1]: Finished Network Manager Wait Online.
Nov 22 02:19:17 np0005531888 python3.9[55378]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:19:23 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:19:23 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:19:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:19:23 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:19:23 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:19:23 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:19:27 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:19:27 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:19:27 np0005531888 systemd[1]: run-r9a6898f24c9e490684ccada218859036.service: Deactivated successfully.
Nov 22 02:19:27 np0005531888 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 02:19:28 np0005531888 python3.9[55838]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:19:29 np0005531888 python3.9[55990]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:30 np0005531888 python3.9[56144]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:31 np0005531888 python3.9[56296]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:32 np0005531888 python3.9[56448]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:32 np0005531888 python3.9[56600]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:33 np0005531888 python3.9[56752]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:19:34 np0005531888 python3.9[56875]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795973.1014216-655-122923052412453/.source _original_basename=.3ntdns4t follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:35 np0005531888 python3.9[57027]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:36 np0005531888 python3.9[57179]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 22 02:19:37 np0005531888 python3.9[57331]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:40 np0005531888 python3.9[57758]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 22 02:19:41 np0005531888 ansible-async_wrapper.py[57933]: Invoked with j684272940563 300 /home/zuul/.ansible/tmp/ansible-tmp-1763795980.444076-852-117857165847868/AnsiballZ_edpm_os_net_config.py _
Nov 22 02:19:41 np0005531888 ansible-async_wrapper.py[57936]: Starting module and watcher
Nov 22 02:19:41 np0005531888 ansible-async_wrapper.py[57936]: Start watching 57937 (300)
Nov 22 02:19:41 np0005531888 ansible-async_wrapper.py[57937]: Start module (57937)
Nov 22 02:19:41 np0005531888 ansible-async_wrapper.py[57933]: Return async_wrapper task started.
Nov 22 02:19:41 np0005531888 python3.9[57938]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 22 02:19:42 np0005531888 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 22 02:19:42 np0005531888 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 22 02:19:42 np0005531888 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 22 02:19:42 np0005531888 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 22 02:19:42 np0005531888 kernel: cfg80211: failed to load regulatory.db
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.3997] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4029] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4758] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4760] audit: op="connection-add" uuid="10e9b76b-0d41-45fe-a8bb-53c1ffae4ba5" name="br-ex-br" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4783] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4785] audit: op="connection-add" uuid="4c29d942-3ccc-4820-a262-4e7d7b05c254" name="br-ex-port" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4798] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4799] audit: op="connection-add" uuid="82d1f0d2-c4c1-4feb-a681-5a2a7c499085" name="eth1-port" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4813] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4814] audit: op="connection-add" uuid="3fc507cf-8faa-4989-8255-02b52490fa07" name="vlan20-port" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4826] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4827] audit: op="connection-add" uuid="7cd177bc-d533-421c-81f7-bb5917ad91b4" name="vlan21-port" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4839] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4841] audit: op="connection-add" uuid="0f53254e-ad02-4654-b8f4-3c37e62dcf99" name="vlan22-port" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4866] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4888] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.4891] audit: op="connection-add" uuid="9b4c1690-e2e4-4579-b009-7857125c9561" name="br-ex-if" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5322] audit: op="connection-update" uuid="118125b3-7f6e-507c-b3e2-eabc6d5622d9" name="ci-private-network" args="ipv6.dns,ipv6.routes,ipv6.addr-gen-mode,ipv6.addresses,ipv6.method,ipv6.routing-rules,ipv4.dns,ipv4.routes,ipv4.never-default,ipv4.routing-rules,ipv4.addresses,ipv4.method,ovs-external-ids.data,ovs-interface.type,connection.port-type,connection.controller,connection.master,connection.timestamp,connection.slave-type" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5343] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5345] audit: op="connection-add" uuid="5fbe1ccb-73f6-4b82-a7c0-fde4f914d991" name="vlan20-if" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5361] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5363] audit: op="connection-add" uuid="97986e3b-6fd8-4545-82be-6b25865655e0" name="vlan21-if" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5379] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5381] audit: op="connection-add" uuid="4ad4f434-9900-4563-8194-81b04b6a387b" name="vlan22-if" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5397] audit: op="connection-delete" uuid="284b0229-109e-3afe-9ffa-d17d71e04c75" name="Wired connection 1" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5408] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5418] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5421] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (10e9b76b-0d41-45fe-a8bb-53c1ffae4ba5)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5421] audit: op="connection-activate" uuid="10e9b76b-0d41-45fe-a8bb-53c1ffae4ba5" name="br-ex-br" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5423] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5430] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5434] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (4c29d942-3ccc-4820-a262-4e7d7b05c254)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5436] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5441] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5446] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (82d1f0d2-c4c1-4feb-a681-5a2a7c499085)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5448] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5456] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5460] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (3fc507cf-8faa-4989-8255-02b52490fa07)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5462] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5469] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5473] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (7cd177bc-d533-421c-81f7-bb5917ad91b4)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5476] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5482] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5487] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (0f53254e-ad02-4654-b8f4-3c37e62dcf99)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5488] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5491] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5492] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5499] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5503] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5508] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (9b4c1690-e2e4-4579-b009-7857125c9561)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5508] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5512] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5514] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5515] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5517] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5531] device (eth1): disconnecting for new activation request.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5532] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5535] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5538] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5540] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5544] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5550] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5554] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (5fbe1ccb-73f6-4b82-a7c0-fde4f914d991)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5556] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5560] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5562] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5564] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5567] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5573] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5579] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (97986e3b-6fd8-4545-82be-6b25865655e0)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5580] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5584] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5586] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5587] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5591] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5597] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5610] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (4ad4f434-9900-4563-8194-81b04b6a387b)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5612] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5616] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5618] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5620] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5623] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5643] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5647] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5651] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5654] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5663] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5668] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5672] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5675] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5677] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5685] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 kernel: ovs-system: entered promiscuous mode
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5691] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5696] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5699] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5706] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5712] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5717] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 systemd-udevd[57944]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:19:43 np0005531888 kernel: Timeout policy base is empty
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5720] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5725] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5731] dhcp4 (eth0): canceled DHCP transaction
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5731] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5731] dhcp4 (eth0): state changed no lease
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5733] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5745] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5749] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57939 uid=0 result="fail" reason="Device is not activated"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.5757] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 02:19:43 np0005531888 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 02:19:43 np0005531888 kernel: br-ex: entered promiscuous mode
Nov 22 02:19:43 np0005531888 kernel: vlan21: entered promiscuous mode
Nov 22 02:19:43 np0005531888 systemd-udevd[57943]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:19:43 np0005531888 kernel: vlan20: entered promiscuous mode
Nov 22 02:19:43 np0005531888 systemd-udevd[57945]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.6612] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.6617] dhcp4 (eth0): state changed new lease, address=38.129.56.229
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.6627] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.6637] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.6645] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.6652] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:43 np0005531888 kernel: vlan22: entered promiscuous mode
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8336] device (eth1): disconnecting for new activation request.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8337] audit: op="connection-activate" uuid="118125b3-7f6e-507c-b3e2-eabc6d5622d9" name="ci-private-network" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8337] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8432] device (eth1): Activation: starting connection 'ci-private-network' (118125b3-7f6e-507c-b3e2-eabc6d5622d9)
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8440] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8441] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8442] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8443] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8444] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8444] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8478] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8479] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8482] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8487] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8491] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8495] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8498] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8501] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8504] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8508] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8511] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8514] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8517] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8520] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8525] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8528] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57939 uid=0 result="success"
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8559] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8566] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8586] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8589] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8597] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8613] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8622] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8627] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8628] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8631] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8636] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8642] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8646] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8651] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8654] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8656] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8661] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8666] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8671] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8676] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8678] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531888 NetworkManager[55166]: <info>  [1763795983.8683] device (eth1): Activation: successful, device activated.
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.0663] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.2638] checkpoint[0x5623d383a950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.2640] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 python3.9[58272]: ansible-ansible.legacy.async_status Invoked with jid=j684272940563.57933 mode=status _async_dir=/root/.ansible_async
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.5492] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.5503] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.7566] audit: op="networking-control" arg="global-dns-configuration" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.7609] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.7667] audit: op="networking-control" arg="global-dns-configuration" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.7698] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.9095] checkpoint[0x5623d383aa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 22 02:19:45 np0005531888 NetworkManager[55166]: <info>  [1763795985.9099] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57939 uid=0 result="success"
Nov 22 02:19:45 np0005531888 ansible-async_wrapper.py[57937]: Module complete (57937)
Nov 22 02:19:46 np0005531888 ansible-async_wrapper.py[57936]: Done in kid B.
Nov 22 02:19:46 np0005531888 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 02:19:48 np0005531888 python3.9[58380]: ansible-ansible.legacy.async_status Invoked with jid=j684272940563.57933 mode=status _async_dir=/root/.ansible_async
Nov 22 02:19:49 np0005531888 python3.9[58480]: ansible-ansible.legacy.async_status Invoked with jid=j684272940563.57933 mode=cleanup _async_dir=/root/.ansible_async
Nov 22 02:19:50 np0005531888 python3.9[58632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:19:50 np0005531888 python3.9[58755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795989.777778-928-93234052791751/.source.returncode _original_basename=.fl26vg7f follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:51 np0005531888 python3.9[58907]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:19:52 np0005531888 python3.9[59031]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795991.2231238-976-43757643770524/.source.cfg _original_basename=.1biiekgt follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:53 np0005531888 python3.9[59183]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:19:53 np0005531888 systemd[1]: Reloading Network Manager...
Nov 22 02:19:53 np0005531888 NetworkManager[55166]: <info>  [1763795993.3558] audit: op="reload" arg="0" pid=59187 uid=0 result="success"
Nov 22 02:19:53 np0005531888 NetworkManager[55166]: <info>  [1763795993.3567] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 22 02:19:53 np0005531888 systemd[1]: Reloaded Network Manager.
Nov 22 02:19:53 np0005531888 systemd[1]: session-12.scope: Deactivated successfully.
Nov 22 02:19:53 np0005531888 systemd[1]: session-12.scope: Consumed 51.693s CPU time.
Nov 22 02:19:53 np0005531888 systemd-logind[825]: Session 12 logged out. Waiting for processes to exit.
Nov 22 02:19:53 np0005531888 systemd-logind[825]: Removed session 12.
Nov 22 02:20:00 np0005531888 systemd-logind[825]: New session 13 of user zuul.
Nov 22 02:20:00 np0005531888 systemd[1]: Started Session 13 of User zuul.
Nov 22 02:20:01 np0005531888 python3.9[59371]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:02 np0005531888 python3.9[59526]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:03 np0005531888 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 02:20:03 np0005531888 python3.9[59716]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:20:04 np0005531888 systemd[1]: session-13.scope: Deactivated successfully.
Nov 22 02:20:04 np0005531888 systemd[1]: session-13.scope: Consumed 2.261s CPU time.
Nov 22 02:20:04 np0005531888 systemd-logind[825]: Session 13 logged out. Waiting for processes to exit.
Nov 22 02:20:04 np0005531888 systemd-logind[825]: Removed session 13.
Nov 22 02:20:10 np0005531888 systemd-logind[825]: New session 14 of user zuul.
Nov 22 02:20:10 np0005531888 systemd[1]: Started Session 14 of User zuul.
Nov 22 02:20:11 np0005531888 python3.9[59898]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:12 np0005531888 python3.9[60052]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:14 np0005531888 python3.9[60208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:15 np0005531888 python3.9[60293]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:20:17 np0005531888 python3.9[60446]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:18 np0005531888 python3.9[60637]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:19 np0005531888 python3.9[60789]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:20:19 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:20:20 np0005531888 python3.9[60952]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:20 np0005531888 python3.9[61030]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:21 np0005531888 python3.9[61182]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:21 np0005531888 python3.9[61260]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:22 np0005531888 python3.9[61412]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:23 np0005531888 python3.9[61564]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:24 np0005531888 python3.9[61716]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:24 np0005531888 python3.9[61868]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:25 np0005531888 python3.9[62020]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:20:28 np0005531888 python3.9[62173]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:29 np0005531888 python3.9[62327]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:20:30 np0005531888 python3.9[62479]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:20:30 np0005531888 python3.9[62631]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:20:31 np0005531888 python3.9[62784]: ansible-service_facts Invoked
Nov 22 02:20:31 np0005531888 network[62801]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:20:31 np0005531888 network[62802]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:20:31 np0005531888 network[62803]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:20:37 np0005531888 python3.9[63255]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:20:39 np0005531888 python3.9[63408]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 22 02:20:41 np0005531888 python3.9[63560]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:41 np0005531888 python3.9[63685]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796040.8136735-665-68199859281333/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:42 np0005531888 python3.9[63839]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:43 np0005531888 python3.9[63964]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796042.2438025-710-241758512436835/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:45 np0005531888 python3.9[64118]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:47 np0005531888 python3.9[64272]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:48 np0005531888 python3.9[64356]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:20:49 np0005531888 python3.9[64510]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:50 np0005531888 python3.9[64594]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:20:50 np0005531888 chronyd[833]: chronyd exiting
Nov 22 02:20:50 np0005531888 systemd[1]: Stopping NTP client/server...
Nov 22 02:20:50 np0005531888 systemd[1]: chronyd.service: Deactivated successfully.
Nov 22 02:20:50 np0005531888 systemd[1]: Stopped NTP client/server.
Nov 22 02:20:50 np0005531888 systemd[1]: Starting NTP client/server...
Nov 22 02:20:50 np0005531888 chronyd[64602]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 02:20:50 np0005531888 chronyd[64602]: Frequency -23.579 +/- 0.171 ppm read from /var/lib/chrony/drift
Nov 22 02:20:50 np0005531888 chronyd[64602]: Loaded seccomp filter (level 2)
Nov 22 02:20:50 np0005531888 systemd[1]: Started NTP client/server.
Nov 22 02:20:51 np0005531888 systemd[1]: session-14.scope: Deactivated successfully.
Nov 22 02:20:51 np0005531888 systemd[1]: session-14.scope: Consumed 24.930s CPU time.
Nov 22 02:20:51 np0005531888 systemd-logind[825]: Session 14 logged out. Waiting for processes to exit.
Nov 22 02:20:51 np0005531888 systemd-logind[825]: Removed session 14.
Nov 22 02:20:57 np0005531888 systemd-logind[825]: New session 15 of user zuul.
Nov 22 02:20:57 np0005531888 systemd[1]: Started Session 15 of User zuul.
Nov 22 02:20:58 np0005531888 python3.9[64781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:59 np0005531888 python3.9[64937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:00 np0005531888 python3.9[65112]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:01 np0005531888 python3.9[65190]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.mzsx1_1l recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:02 np0005531888 python3.9[65342]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:03 np0005531888 python3.9[65465]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796061.6948676-150-19273546528037/.source _original_basename=.rwv5e3je follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:03 np0005531888 python3.9[65617]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:21:04 np0005531888 python3.9[65769]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:05 np0005531888 python3.9[65892]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796064.1075628-222-110683153841945/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:21:05 np0005531888 python3.9[66044]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:06 np0005531888 python3.9[66167]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796065.260011-222-38706625265360/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:21:07 np0005531888 python3.9[66319]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:07 np0005531888 python3.9[66471]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:08 np0005531888 python3.9[66594]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796067.3233418-334-214941108398197/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:09 np0005531888 python3.9[66746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:09 np0005531888 python3.9[66869]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796068.7567182-379-56704817683157/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:11 np0005531888 python3.9[67021]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:11 np0005531888 systemd[1]: Reloading.
Nov 22 02:21:11 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:11 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:11 np0005531888 systemd[1]: Reloading.
Nov 22 02:21:11 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:11 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:11 np0005531888 systemd[1]: Starting EDPM Container Shutdown...
Nov 22 02:21:11 np0005531888 systemd[1]: Finished EDPM Container Shutdown.
Nov 22 02:21:12 np0005531888 python3.9[67249]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:12 np0005531888 python3.9[67372]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796071.842921-447-275244390480296/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:13 np0005531888 python3.9[67524]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:14 np0005531888 python3.9[67647]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796073.0936143-492-183694331681669/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:15 np0005531888 python3.9[67799]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:15 np0005531888 systemd[1]: Reloading.
Nov 22 02:21:15 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:15 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:15 np0005531888 systemd[1]: Reloading.
Nov 22 02:21:15 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:15 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:15 np0005531888 systemd[1]: Starting Create netns directory...
Nov 22 02:21:15 np0005531888 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:21:15 np0005531888 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:21:15 np0005531888 systemd[1]: Finished Create netns directory.
Nov 22 02:21:16 np0005531888 python3.9[68026]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:21:16 np0005531888 network[68043]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:21:16 np0005531888 network[68044]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:21:16 np0005531888 network[68045]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:21:20 np0005531888 python3.9[68307]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:20 np0005531888 systemd[1]: Reloading.
Nov 22 02:21:21 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:21 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:21 np0005531888 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 22 02:21:21 np0005531888 iptables.init[68347]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 22 02:21:21 np0005531888 iptables.init[68347]: iptables: Flushing firewall rules: [  OK  ]
Nov 22 02:21:21 np0005531888 systemd[1]: iptables.service: Deactivated successfully.
Nov 22 02:21:21 np0005531888 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 22 02:21:22 np0005531888 python3.9[68543]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:23 np0005531888 python3.9[68697]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:21:23 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:23 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:23 np0005531888 systemd[1]: Starting Netfilter Tables...
Nov 22 02:21:23 np0005531888 systemd[1]: Finished Netfilter Tables.
Nov 22 02:21:24 np0005531888 python3.9[68888]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:25 np0005531888 python3.9[69041]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:26 np0005531888 python3.9[69166]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796085.2783608-700-119252781973083/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:27 np0005531888 python3.9[69319]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:21:27 np0005531888 systemd[1]: Reloading OpenSSH server daemon...
Nov 22 02:21:27 np0005531888 systemd[1]: Reloaded OpenSSH server daemon.
Nov 22 02:21:28 np0005531888 python3.9[69475]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:28 np0005531888 python3.9[69627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:29 np0005531888 python3.9[69750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796088.3494844-793-67059312842276/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:30 np0005531888 python3.9[69902]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 02:21:30 np0005531888 systemd[1]: Starting Time & Date Service...
Nov 22 02:21:30 np0005531888 systemd[1]: Started Time & Date Service.
Nov 22 02:21:31 np0005531888 python3.9[70058]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:32 np0005531888 python3.9[70210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:32 np0005531888 python3.9[70333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796091.9136066-898-126008364279695/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:33 np0005531888 python3.9[70485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:34 np0005531888 python3.9[70608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796093.2575543-943-57174151193111/.source.yaml _original_basename=.t0tqt02k follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:35 np0005531888 python3.9[70760]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:36 np0005531888 python3.9[70883]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796095.1512864-988-67649619938739/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:36 np0005531888 python3.9[71035]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:37 np0005531888 python3.9[71188]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:38 np0005531888 python3[71341]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:21:39 np0005531888 python3.9[71493]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:39 np0005531888 python3.9[71616]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796098.8076465-1105-208791178434538/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:40 np0005531888 python3.9[71768]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:41 np0005531888 python3.9[71891]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796100.2221084-1150-158116389903728/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:42 np0005531888 python3.9[72043]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:42 np0005531888 python3.9[72166]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796101.5780065-1195-247398259566649/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:43 np0005531888 python3.9[72318]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:44 np0005531888 python3.9[72441]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796103.3898892-1240-125970424842252/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:45 np0005531888 python3.9[72593]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:45 np0005531888 python3.9[72716]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796104.6334968-1285-166940869317072/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:46 np0005531888 python3.9[72868]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:47 np0005531888 python3.9[73020]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:48 np0005531888 python3.9[73179]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:49 np0005531888 python3.9[73332]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:50 np0005531888 python3.9[73484]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:51 np0005531888 python3.9[73636]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 02:21:51 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:21:51 np0005531888 python3.9[73790]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 02:21:52 np0005531888 systemd[1]: session-15.scope: Deactivated successfully.
Nov 22 02:21:52 np0005531888 systemd[1]: session-15.scope: Consumed 33.965s CPU time.
Nov 22 02:21:52 np0005531888 systemd-logind[825]: Session 15 logged out. Waiting for processes to exit.
Nov 22 02:21:52 np0005531888 systemd-logind[825]: Removed session 15.
Nov 22 02:21:58 np0005531888 systemd-logind[825]: New session 16 of user zuul.
Nov 22 02:21:58 np0005531888 systemd[1]: Started Session 16 of User zuul.
Nov 22 02:21:58 np0005531888 python3.9[73971]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 22 02:21:59 np0005531888 python3.9[74123]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:00 np0005531888 python3.9[74275]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:00 np0005531888 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 02:22:01 np0005531888 python3.9[74429]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmnpAkzBA+/P5ygAqTpHSo/yxyshcDXOqGY2sZ6+LmKpfF/U/3puURRCYPFHLvU6Fe2oRGY6GwNjK7ej5idUzOOTgf6eMc2MfuxlwwYk9lQWXXYu3BIFbZTa/Jz2j3Jd5KpxE11/bi7aYfn5u+oXd0Q+EgbyaX14S6EGKPujybZZbWbPUjXyNBIpHDRP3QOvtmf0oXpNj7FZ/+eQ5okb2AzQeflovexeLh5/TrUuMpBgxJC+IT5bDgtr3scwyEN7Su9iQQos2qnNIIzuFTAJrbao4uS5RsC+rRO10O4Z+2p8nWhQuSG2tQ63gvUhaXg8h1KFhHYfclNow/Nzxq1rSASWv2iNeUsoDWgxH7Yq3GPbGEofld095ADvo32HdVYHmdYEaD9GLY7WKHW6ilz14vUYQ6cN6XZoli1rdTt1Z/UWpQSy64npnbT3IGeztmD2KPGZP2laTkFkxzTh7m0Dz2sBx1rbfhQV8SjNw0ZkeSV+G3sqXWqozNXMvk1k7Ma/8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPqC7967YjYmXjy5Y1Atr1idIuJEqYVlUbJ/ivnjEtjv#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJYSJRTKIRJHGorxvpDox6ZIDiNrie6EQnECMuD5IFEY7kEn/cP5JLTUpe4kf0aZt1r5R4WnwY6StRedSzkyRk0=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCowOVwCxoeDeu/gjiGxT6DsxjadsI6OsklC9oYH1562wrbeZDtXV2FXgAB8clz6v5hIpHsJBOPMHniNRFVQwlu7A3igwl6rkisIR380P+Ttep7r2pEz15KdpK86MS3svcPZn5qKpfnr+3JUxX9Kt71rH4jGzpDSQCFB/vJPmodINZL7o8vaTg1Gz0vkf+zJlmQjq3fUKVrInLbL6hPyuV8pXqtw3q+JYCIrXJlHDFPOngM4PsGnOJL6j9PaOEdRXK30tQNQlzko6lfntblufy8mAb/o9Sn1ulCIbI1nIJqkTVm9aK31C4nWSPumTQ9GLdi0dvultCwMbw0ym7pzFAWlxrsx4V9GRz2yqAPLbNwEFoaA42ScSLnQpq+Y3747tGiT5jdKz2AyCBa6sN43tUXKR/mtjBpXXoOsCvgvzvnlul+TRmjoju2jFsL05dlNImskQ1UwAn5iIr+7TvzDF03jeQYani/6aykV0z4KJyt0VneL9fYnSlSZ7dnpTfkYgc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII6yEJELXMtPQkFh9QeTL6LtFdllgEtcCx/vTvD4VQRD#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENm8i1piornPA+bu3lQ1gnDuQ+S2zp/iE9MvAGxNHKKvA3MMS333GJWFx+BV6kDFZ9hDTDj/kimvzvpu8lziNQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQtcPDTu5W9vsjUBaPvvVOkaIA7MPqUmOieWOqa7ySB03c8aREkaNDH3Zlp68jXwdC1Qpw0/2EGQ83sfaSlvG6XSE4QBwVDoOMe7GzTY8agr/ZZOIedAz8v04HH0OpnsD0tqLQlZZ0nuBJ7UM7iP5PTbc7O1Z2n35+F+XTqiKfSmsCSxhnwJhgyZBKS0HJUIsvQoVw1N699OnanMSweTImsEURAEEsL3zrVM9Qa/uw2XH5LTuU9kXzfqKNgy/5VXcEbamLe+cPbFPKDc8ei1sCASL3xDbyGriLdNKOiSjytc5GTcG0eg5aHmBxz1/KWYAf9JCs/xAGk0Nifft+xlfC1OwkiPBCHsUlfVWzERxno/lVQQGvrNTgMZ1G/lJwuhRYCWlScgfADkcZSitTszGd/qlunDx3biSKRE1RnACqaNsF0Hum0S7m6d8wxj6TNoD618+lN82HqRhRMhrVQ+hxQySpHEXSWTdhfVLND+2neEL9hyT0cCzB33FvD+YacRk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIBd7JSklzBfXUPIvKiAxXVL//OQf5r0dI648cExbdgs#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAE41g4YqzC3bfy1t/lRYP5p85+7h3wD8DzLbz0LtdbkROkWg/OHzC73WNbkqdHKqwacHfch6fbycv9mIDE73cM=#012 create=True mode=0644 path=/tmp/ansible.yop7hqe3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:02 np0005531888 python3.9[74581]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.yop7hqe3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:03 np0005531888 python3.9[74735]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.yop7hqe3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:03 np0005531888 systemd[1]: session-16.scope: Deactivated successfully.
Nov 22 02:22:03 np0005531888 systemd[1]: session-16.scope: Consumed 3.191s CPU time.
Nov 22 02:22:03 np0005531888 systemd-logind[825]: Session 16 logged out. Waiting for processes to exit.
Nov 22 02:22:03 np0005531888 systemd-logind[825]: Removed session 16.
Nov 22 02:22:09 np0005531888 systemd-logind[825]: New session 17 of user zuul.
Nov 22 02:22:09 np0005531888 systemd[1]: Started Session 17 of User zuul.
Nov 22 02:22:10 np0005531888 python3.9[74913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:11 np0005531888 python3.9[75069]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 02:22:12 np0005531888 python3.9[75223]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:22:13 np0005531888 python3.9[75376]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:14 np0005531888 python3.9[75529]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:15 np0005531888 python3.9[75683]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:16 np0005531888 python3.9[75839]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:16 np0005531888 systemd[1]: session-17.scope: Deactivated successfully.
Nov 22 02:22:16 np0005531888 systemd[1]: session-17.scope: Consumed 4.225s CPU time.
Nov 22 02:22:16 np0005531888 systemd-logind[825]: Session 17 logged out. Waiting for processes to exit.
Nov 22 02:22:16 np0005531888 systemd-logind[825]: Removed session 17.
Nov 22 02:22:21 np0005531888 systemd-logind[825]: New session 18 of user zuul.
Nov 22 02:22:21 np0005531888 systemd[1]: Started Session 18 of User zuul.
Nov 22 02:22:22 np0005531888 python3.9[76017]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:24 np0005531888 python3.9[76173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:22:24 np0005531888 python3.9[76257]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:22:27 np0005531888 python3.9[76408]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:28 np0005531888 python3.9[76559]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:22:29 np0005531888 python3.9[76709]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:30 np0005531888 python3.9[76859]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:30 np0005531888 systemd[1]: session-18.scope: Deactivated successfully.
Nov 22 02:22:30 np0005531888 systemd[1]: session-18.scope: Consumed 5.674s CPU time.
Nov 22 02:22:30 np0005531888 systemd-logind[825]: Session 18 logged out. Waiting for processes to exit.
Nov 22 02:22:30 np0005531888 systemd-logind[825]: Removed session 18.
Nov 22 02:22:36 np0005531888 systemd-logind[825]: New session 19 of user zuul.
Nov 22 02:22:36 np0005531888 systemd[1]: Started Session 19 of User zuul.
Nov 22 02:22:37 np0005531888 python3.9[77037]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:39 np0005531888 python3.9[77193]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:40 np0005531888 python3.9[77345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:41 np0005531888 python3.9[77497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:42 np0005531888 python3.9[77620]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796160.6250224-162-149780336274556/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=4d9131d4a27625057eaa811efacf92903752180d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:42 np0005531888 python3.9[77772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:43 np0005531888 python3.9[77895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796162.160514-162-115271807530538/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=16bd60fbc24423e3fc1bbc9e201827083d9b5e39 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:43 np0005531888 python3.9[78047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:44 np0005531888 python3.9[78170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796163.2797904-162-268814524759861/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=9582de61dfb0bbdab7688531e638cddfcfc7d630 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:44 np0005531888 python3.9[78322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:45 np0005531888 python3.9[78474]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:46 np0005531888 python3.9[78626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:46 np0005531888 python3.9[78749]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796165.800024-339-245298098905664/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=f71857488a6cf6f47dd0f6d2835c691d683c2ffe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:47 np0005531888 python3.9[78901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:47 np0005531888 python3.9[79024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796167.0111275-339-48978653549104/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=8ad07d9f15fb881d541cc871f705c812e1318a58 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:48 np0005531888 python3.9[79176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:49 np0005531888 python3.9[79299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796168.1235278-339-121745981668223/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=827814f70d49db4794055bb335c02b7f65b7b099 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:49 np0005531888 python3.9[79452]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:50 np0005531888 python3.9[79604]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:51 np0005531888 python3.9[79756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:51 np0005531888 python3.9[79879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796170.5990722-506-57599213599560/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=a24dc3a3ff39db2a2194d450d38ed2285a5bab60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:52 np0005531888 python3.9[80031]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:52 np0005531888 python3.9[80154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796171.742677-506-157338717846408/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=ab92e79a33a6e2fca5144cd0532be918fe14e6b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:53 np0005531888 python3.9[80306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:53 np0005531888 python3.9[80429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796172.846344-506-117140923777675/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=421987e49260e1d96fb8c4eb2339a09042d4d7ae backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:54 np0005531888 python3.9[80581]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:55 np0005531888 python3.9[80733]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:55 np0005531888 python3.9[80885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:56 np0005531888 python3.9[81008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796175.3636527-679-16892197362034/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=06922c3b91ec283c8209012eb8b29cd62ade97fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:56 np0005531888 python3.9[81160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:57 np0005531888 python3.9[81283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796176.5047505-679-218184700251849/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=ab92e79a33a6e2fca5144cd0532be918fe14e6b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:58 np0005531888 python3.9[81435]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:58 np0005531888 python3.9[81558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796177.681063-679-232741446262248/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=d85c0f15d1325eec3d779da262f98bf3eea0d17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:59 np0005531888 python3.9[81710]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:00 np0005531888 python3.9[81862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:01 np0005531888 python3.9[81985]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796180.1316128-863-204076637141560/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:01 np0005531888 python3.9[82137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:02 np0005531888 python3.9[82289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:02 np0005531888 python3.9[82412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796181.98008-940-103691068809926/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:03 np0005531888 python3.9[82564]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:04 np0005531888 python3.9[82716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:04 np0005531888 python3.9[82839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796183.8703353-1011-142019730327260/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:05 np0005531888 python3.9[82991]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:06 np0005531888 python3.9[83143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:06 np0005531888 python3.9[83266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796185.745507-1082-185622329402233/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:07 np0005531888 python3.9[83418]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:08 np0005531888 python3.9[83570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:08 np0005531888 python3.9[83693]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796187.6549053-1152-34100927044684/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:09 np0005531888 python3.9[83845]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:10 np0005531888 python3.9[83997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:10 np0005531888 python3.9[84120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796189.7622366-1227-266999368888580/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:11 np0005531888 python3.9[84272]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:12 np0005531888 python3.9[84424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:12 np0005531888 python3.9[84547]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796191.6842256-1298-140776440144070/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:14 np0005531888 systemd[1]: session-19.scope: Deactivated successfully.
Nov 22 02:23:14 np0005531888 systemd[1]: session-19.scope: Consumed 28.039s CPU time.
Nov 22 02:23:14 np0005531888 systemd-logind[825]: Session 19 logged out. Waiting for processes to exit.
Nov 22 02:23:14 np0005531888 systemd-logind[825]: Removed session 19.
Nov 22 02:23:20 np0005531888 systemd-logind[825]: New session 20 of user zuul.
Nov 22 02:23:20 np0005531888 systemd[1]: Started Session 20 of User zuul.
Nov 22 02:23:21 np0005531888 python3.9[84725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:23:22 np0005531888 python3.9[84881]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:23 np0005531888 python3.9[85033]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:24 np0005531888 python3.9[85183]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:23:24 np0005531888 python3.9[85335]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 02:23:26 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 22 02:23:27 np0005531888 python3.9[85491]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:23:28 np0005531888 python3.9[85575]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:23:30 np0005531888 python3.9[85728]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:23:31 np0005531888 python3[85883]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 22 02:23:32 np0005531888 python3.9[86035]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:33 np0005531888 python3.9[86187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:33 np0005531888 python3.9[86265]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:34 np0005531888 python3.9[86417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:34 np0005531888 python3.9[86495]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.uac8heqe recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:35 np0005531888 python3.9[86647]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:35 np0005531888 python3.9[86725]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:36 np0005531888 python3.9[86877]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:37 np0005531888 python3[87030]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:23:38 np0005531888 python3.9[87182]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:39 np0005531888 python3.9[87307]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796217.9698215-439-53250899944711/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:39 np0005531888 python3.9[87459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:40 np0005531888 python3.9[87584]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796219.4015276-484-201413925027059/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:41 np0005531888 python3.9[87736]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:41 np0005531888 python3.9[87861]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796220.6567812-529-2519532829304/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:42 np0005531888 python3.9[88013]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:43 np0005531888 python3.9[88138]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796222.1050844-573-194106437222567/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:43 np0005531888 python3.9[88290]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:44 np0005531888 python3.9[88415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796223.4568615-619-109340581510458/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:45 np0005531888 python3.9[88567]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:45 np0005531888 python3.9[88719]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:46 np0005531888 python3.9[88874]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:47 np0005531888 python3.9[89026]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:48 np0005531888 python3.9[89179]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:23:49 np0005531888 python3.9[89333]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:49 np0005531888 python3.9[89488]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:51 np0005531888 python3.9[89638]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:23:52 np0005531888 python3.9[89791]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:52 np0005531888 ovs-vsctl[89792]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 22 02:23:53 np0005531888 python3.9[89944]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:54 np0005531888 python3.9[90099]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:54 np0005531888 ovs-vsctl[90100]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 22 02:23:54 np0005531888 python3.9[90250]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:23:55 np0005531888 python3.9[90404]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:56 np0005531888 python3.9[90556]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:57 np0005531888 python3.9[90634]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:57 np0005531888 python3.9[90786]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:58 np0005531888 python3.9[90864]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:58 np0005531888 python3.9[91016]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:59 np0005531888 python3.9[91168]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:00 np0005531888 python3.9[91246]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:00 np0005531888 python3.9[91399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:01 np0005531888 python3.9[91477]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:02 np0005531888 python3.9[91629]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:24:02 np0005531888 systemd[1]: Reloading.
Nov 22 02:24:02 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:02 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:03 np0005531888 python3.9[91817]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:03 np0005531888 python3.9[91895]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:04 np0005531888 chronyd[64602]: Selected source 162.159.200.123 (pool.ntp.org)
Nov 22 02:24:04 np0005531888 python3.9[92047]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:05 np0005531888 python3.9[92125]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:05 np0005531888 python3.9[92277]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:24:05 np0005531888 systemd[1]: Reloading.
Nov 22 02:24:06 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:06 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:06 np0005531888 systemd[1]: Starting Create netns directory...
Nov 22 02:24:06 np0005531888 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:24:06 np0005531888 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:24:06 np0005531888 systemd[1]: Finished Create netns directory.
Nov 22 02:24:07 np0005531888 python3.9[92470]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:08 np0005531888 python3.9[92622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:08 np0005531888 python3.9[92745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796247.5685296-1372-102454478841416/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:09 np0005531888 python3.9[92897]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:10 np0005531888 python3.9[93049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:11 np0005531888 python3.9[93172]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796250.402978-1446-87560217881156/.source.json _original_basename=.2ie33u12 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:12 np0005531888 python3.9[93324]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:14 np0005531888 python3.9[93751]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 22 02:24:15 np0005531888 python3.9[93903]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:24:16 np0005531888 python3.9[94055]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 02:24:16 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:18 np0005531888 python3[94219]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:24:18 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:18 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:18 np0005531888 podman[94255]: 2025-11-22 07:24:18.670939837 +0000 UTC m=+0.047407542 container create cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:24:18 np0005531888 podman[94255]: 2025-11-22 07:24:18.64472265 +0000 UTC m=+0.021190375 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 02:24:18 np0005531888 python3[94219]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 02:24:19 np0005531888 python3.9[94443]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:24:19 np0005531888 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:20 np0005531888 python3.9[94597]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:20 np0005531888 python3.9[94673]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:24:21 np0005531888 python3.9[94824]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796260.83545-1710-184417179510589/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:22 np0005531888 python3.9[94900]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:24:22 np0005531888 systemd[1]: Reloading.
Nov 22 02:24:22 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:22 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:23 np0005531888 python3.9[95011]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:24:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:24:23 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:23 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:23 np0005531888 systemd[1]: Starting ovn_controller container...
Nov 22 02:24:23 np0005531888 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 22 02:24:23 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:24:23 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/834bfab6235e7e25fcece4eb172d93f4be4fc0ca517f8f3b8da2f744aed0be41/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 02:24:23 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006.
Nov 22 02:24:23 np0005531888 podman[95052]: 2025-11-22 07:24:23.582695578 +0000 UTC m=+0.125008398 container init cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 02:24:23 np0005531888 ovn_controller[95067]: + sudo -E kolla_set_configs
Nov 22 02:24:23 np0005531888 podman[95052]: 2025-11-22 07:24:23.615973607 +0000 UTC m=+0.158286407 container start cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:24:23 np0005531888 edpm-start-podman-container[95052]: ovn_controller
Nov 22 02:24:23 np0005531888 systemd[1]: Created slice User Slice of UID 0.
Nov 22 02:24:23 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 22 02:24:23 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 22 02:24:23 np0005531888 systemd[1]: Starting User Manager for UID 0...
Nov 22 02:24:23 np0005531888 edpm-start-podman-container[95051]: Creating additional drop-in dependency for "ovn_controller" (cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006)
Nov 22 02:24:23 np0005531888 podman[95074]: 2025-11-22 07:24:23.686325065 +0000 UTC m=+0.057860216 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:24:23 np0005531888 systemd[1]: cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006-1d059ca37b79f2c1.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:24:23 np0005531888 systemd[1]: cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006-1d059ca37b79f2c1.service: Failed with result 'exit-code'.
Nov 22 02:24:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:24:23 np0005531888 systemd[95099]: Queued start job for default target Main User Target.
Nov 22 02:24:23 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:23 np0005531888 systemd[95099]: Created slice User Application Slice.
Nov 22 02:24:23 np0005531888 systemd[95099]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 22 02:24:23 np0005531888 systemd[95099]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:24:23 np0005531888 systemd[95099]: Reached target Paths.
Nov 22 02:24:23 np0005531888 systemd[95099]: Reached target Timers.
Nov 22 02:24:23 np0005531888 systemd[95099]: Starting D-Bus User Message Bus Socket...
Nov 22 02:24:23 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:23 np0005531888 systemd[95099]: Starting Create User's Volatile Files and Directories...
Nov 22 02:24:23 np0005531888 systemd[95099]: Finished Create User's Volatile Files and Directories.
Nov 22 02:24:23 np0005531888 systemd[95099]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:24:23 np0005531888 systemd[95099]: Reached target Sockets.
Nov 22 02:24:23 np0005531888 systemd[95099]: Reached target Basic System.
Nov 22 02:24:23 np0005531888 systemd[95099]: Reached target Main User Target.
Nov 22 02:24:23 np0005531888 systemd[95099]: Startup finished in 126ms.
Nov 22 02:24:23 np0005531888 systemd[1]: Started User Manager for UID 0.
Nov 22 02:24:23 np0005531888 systemd[1]: Started ovn_controller container.
Nov 22 02:24:23 np0005531888 systemd[1]: Started Session c1 of User root.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: INFO:__main__:Validating config file
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: INFO:__main__:Writing out command to execute
Nov 22 02:24:24 np0005531888 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: ++ cat /run_command
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + ARGS=
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + sudo kolla_copy_cacerts
Nov 22 02:24:24 np0005531888 systemd[1]: Started Session c2 of User root.
Nov 22 02:24:24 np0005531888 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + [[ ! -n '' ]]
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + . kolla_extend_start
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + umask 0022
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 22 02:24:24 np0005531888 NetworkManager[55166]: <info>  [1763796264.1279] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 22 02:24:24 np0005531888 NetworkManager[55166]: <info>  [1763796264.1285] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:24:24 np0005531888 NetworkManager[55166]: <info>  [1763796264.1295] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 22 02:24:24 np0005531888 NetworkManager[55166]: <info>  [1763796264.1301] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 22 02:24:24 np0005531888 NetworkManager[55166]: <info>  [1763796264.1303] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 02:24:24 np0005531888 kernel: br-int: entered promiscuous mode
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 22 02:24:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:24Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 22 02:24:24 np0005531888 systemd-udevd[95200]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:24:24 np0005531888 python3.9[95330]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:24:24 np0005531888 ovs-vsctl[95331]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 22 02:24:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:25Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:25Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:25Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:25Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:25Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:25Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:25 np0005531888 NetworkManager[55166]: <info>  [1763796265.2661] manager: (ovn-e686e2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 22 02:24:25 np0005531888 kernel: genev_sys_6081: entered promiscuous mode
Nov 22 02:24:25 np0005531888 systemd-udevd[95202]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:24:25 np0005531888 NetworkManager[55166]: <info>  [1763796265.2862] device (genev_sys_6081): carrier: link connected
Nov 22 02:24:25 np0005531888 NetworkManager[55166]: <info>  [1763796265.2865] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 22 02:24:25 np0005531888 python3.9[95487]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:24:25 np0005531888 ovs-vsctl[95489]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 22 02:24:25 np0005531888 NetworkManager[55166]: <info>  [1763796265.9528] manager: (ovn-df0984-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 22 02:24:26 np0005531888 python3.9[95642]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:24:26 np0005531888 ovs-vsctl[95643]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 22 02:24:26 np0005531888 NetworkManager[55166]: <info>  [1763796266.8823] manager: (ovn-73ab13-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 22 02:24:27 np0005531888 systemd-logind[825]: Session 20 logged out. Waiting for processes to exit.
Nov 22 02:24:27 np0005531888 systemd[1]: session-20.scope: Deactivated successfully.
Nov 22 02:24:27 np0005531888 systemd[1]: session-20.scope: Consumed 42.883s CPU time.
Nov 22 02:24:27 np0005531888 systemd-logind[825]: Removed session 20.
Nov 22 02:24:34 np0005531888 systemd[1]: Stopping User Manager for UID 0...
Nov 22 02:24:34 np0005531888 systemd[95099]: Activating special unit Exit the Session...
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped target Main User Target.
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped target Basic System.
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped target Paths.
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped target Sockets.
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped target Timers.
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:24:34 np0005531888 systemd[95099]: Closed D-Bus User Message Bus Socket.
Nov 22 02:24:34 np0005531888 systemd[95099]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:24:34 np0005531888 systemd[95099]: Removed slice User Application Slice.
Nov 22 02:24:34 np0005531888 systemd[95099]: Reached target Shutdown.
Nov 22 02:24:34 np0005531888 systemd[95099]: Finished Exit the Session.
Nov 22 02:24:34 np0005531888 systemd[95099]: Reached target Exit the Session.
Nov 22 02:24:34 np0005531888 systemd[1]: user@0.service: Deactivated successfully.
Nov 22 02:24:34 np0005531888 systemd[1]: Stopped User Manager for UID 0.
Nov 22 02:24:34 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 22 02:24:34 np0005531888 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 22 02:24:34 np0005531888 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 22 02:24:34 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 22 02:24:34 np0005531888 systemd[1]: Removed slice User Slice of UID 0.
Nov 22 02:24:35 np0005531888 systemd-logind[825]: New session 22 of user zuul.
Nov 22 02:24:35 np0005531888 systemd[1]: Started Session 22 of User zuul.
Nov 22 02:24:37 np0005531888 python3.9[95824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:24:38 np0005531888 python3.9[95980]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:39 np0005531888 python3.9[96132]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:40 np0005531888 python3.9[96284]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:40 np0005531888 python3.9[96436]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:41 np0005531888 python3.9[96588]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:42 np0005531888 python3.9[96741]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:24:43 np0005531888 python3.9[96893]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 02:24:45 np0005531888 python3.9[97043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:45 np0005531888 python3.9[97164]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796284.407702-226-280143951757928/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:46 np0005531888 python3.9[97314]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:47 np0005531888 python3.9[97435]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796286.1723948-271-140741278439240/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:48 np0005531888 python3.9[97587]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:24:49 np0005531888 python3.9[97671]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:24:51 np0005531888 python3.9[97824]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:24:52 np0005531888 python3.9[97977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:53 np0005531888 python3.9[98098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796292.266949-381-92095865216093/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:53 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:53Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Nov 22 02:24:53 np0005531888 ovn_controller[95067]: 2025-11-22T07:24:53Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 22 02:24:53 np0005531888 podman[98222]: 2025-11-22 07:24:53.937806683 +0000 UTC m=+0.090108630 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 02:24:54 np0005531888 python3.9[98263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:54 np0005531888 python3.9[98393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796293.608462-381-154081873946223/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:56 np0005531888 python3.9[98543]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:57 np0005531888 python3.9[98664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796296.3625998-513-105241972033682/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:57 np0005531888 python3.9[98814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:58 np0005531888 python3.9[98935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796297.4065819-513-159081343260926/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:59 np0005531888 python3.9[99085]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:24:59 np0005531888 python3.9[99239]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:00 np0005531888 python3.9[99391]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:01 np0005531888 python3.9[99469]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:01 np0005531888 python3.9[99621]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:02 np0005531888 python3.9[99699]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:02 np0005531888 python3.9[99851]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:03 np0005531888 python3.9[100003]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:03 np0005531888 python3.9[100081]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:04 np0005531888 python3.9[100233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:05 np0005531888 python3.9[100311]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:05 np0005531888 python3.9[100463]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:05 np0005531888 systemd[1]: Reloading.
Nov 22 02:25:06 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:06 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:06 np0005531888 python3.9[100652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:07 np0005531888 python3.9[100730]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:07 np0005531888 python3.9[100882]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:08 np0005531888 python3.9[100960]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:09 np0005531888 python3.9[101112]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:09 np0005531888 systemd[1]: Reloading.
Nov 22 02:25:09 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:09 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:09 np0005531888 systemd[1]: Starting Create netns directory...
Nov 22 02:25:09 np0005531888 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:25:09 np0005531888 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:25:09 np0005531888 systemd[1]: Finished Create netns directory.
Nov 22 02:25:10 np0005531888 python3.9[101307]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:10 np0005531888 python3.9[101459]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:11 np0005531888 python3.9[101582]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796310.4731042-966-10257061081167/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:12 np0005531888 python3.9[101734]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:13 np0005531888 python3.9[101886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:13 np0005531888 python3.9[102009]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796312.819872-1041-3223063041876/.source.json _original_basename=._qwz_6na follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:14 np0005531888 python3.9[102161]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:17 np0005531888 python3.9[102588]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 22 02:25:18 np0005531888 python3.9[102740]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:25:19 np0005531888 python3.9[102892]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 02:25:20 np0005531888 python3[103068]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:25:26 np0005531888 podman[103125]: 2025-11-22 07:25:26.529587754 +0000 UTC m=+1.894604870 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:25:27 np0005531888 podman[103081]: 2025-11-22 07:25:27.59723702 +0000 UTC m=+6.738217467 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:25:27 np0005531888 podman[103204]: 2025-11-22 07:25:27.733682919 +0000 UTC m=+0.051846381 container create c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:25:27 np0005531888 podman[103204]: 2025-11-22 07:25:27.702657631 +0000 UTC m=+0.020821113 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:25:27 np0005531888 python3[103068]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:25:28 np0005531888 python3.9[103394]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:25:29 np0005531888 python3.9[103548]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:29 np0005531888 python3.9[103624]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:25:30 np0005531888 python3.9[103775]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796330.0472827-1305-33097562243061/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:31 np0005531888 python3.9[103851]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:25:31 np0005531888 systemd[1]: Reloading.
Nov 22 02:25:31 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:31 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:32 np0005531888 python3.9[103963]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:32 np0005531888 systemd[1]: Reloading.
Nov 22 02:25:32 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:32 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:33 np0005531888 systemd[1]: Starting ovn_metadata_agent container...
Nov 22 02:25:33 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:25:33 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71ec948109befacdeb6608df1b546503cb644fb52e12118abd69afc19857c45d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 22 02:25:33 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71ec948109befacdeb6608df1b546503cb644fb52e12118abd69afc19857c45d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:25:34 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba.
Nov 22 02:25:34 np0005531888 podman[104003]: 2025-11-22 07:25:34.70723332 +0000 UTC m=+1.560597551 container init c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + sudo -E kolla_set_configs
Nov 22 02:25:34 np0005531888 podman[104003]: 2025-11-22 07:25:34.738116484 +0000 UTC m=+1.591480665 container start c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Validating config file
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Copying service configuration files
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Writing out command to execute
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: ++ cat /run_command
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + CMD=neutron-ovn-metadata-agent
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + ARGS=
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + sudo kolla_copy_cacerts
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + [[ ! -n '' ]]
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + . kolla_extend_start
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: Running command: 'neutron-ovn-metadata-agent'
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + umask 0022
Nov 22 02:25:34 np0005531888 ovn_metadata_agent[104018]: + exec neutron-ovn-metadata-agent
Nov 22 02:25:35 np0005531888 edpm-start-podman-container[104003]: ovn_metadata_agent
Nov 22 02:25:35 np0005531888 edpm-start-podman-container[104002]: Creating additional drop-in dependency for "ovn_metadata_agent" (c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba)
Nov 22 02:25:35 np0005531888 podman[104025]: 2025-11-22 07:25:35.255320011 +0000 UTC m=+0.505325412 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:25:35 np0005531888 systemd[1]: Reloading.
Nov 22 02:25:35 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:35 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:35 np0005531888 systemd[1]: Started ovn_metadata_agent container.
Nov 22 02:25:36 np0005531888 systemd-logind[825]: Session 22 logged out. Waiting for processes to exit.
Nov 22 02:25:36 np0005531888 systemd[1]: session-22.scope: Deactivated successfully.
Nov 22 02:25:36 np0005531888 systemd[1]: session-22.scope: Consumed 47.124s CPU time.
Nov 22 02:25:36 np0005531888 systemd-logind[825]: Removed session 22.
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.738 104023 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.738 104023 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.738 104023 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.739 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.739 104023 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.739 104023 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.739 104023 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.739 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.740 104023 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.741 104023 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.742 104023 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.743 104023 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.744 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.745 104023 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.746 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.747 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.748 104023 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.749 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.750 104023 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.751 104023 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.752 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.753 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.754 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.755 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.756 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.757 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.758 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.759 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.760 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.761 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.762 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.763 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.764 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.765 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.766 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.767 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.768 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.769 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.770 104023 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.771 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.772 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.773 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.773 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.773 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.773 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.773 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.774 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.774 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.774 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.774 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.774 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.774 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.775 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.776 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.776 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.776 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.776 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.776 104023 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.776 104023 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.785 104023 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.785 104023 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.786 104023 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.786 104023 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.786 104023 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.799 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 4984e16e-8f1c-4426-bfc6-5927f375ce79 (UUID: 4984e16e-8f1c-4426-bfc6-5927f375ce79) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.821 104023 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.821 104023 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.821 104023 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.822 104023 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.825 104023 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.830 104023 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.835 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '4984e16e-8f1c-4426-bfc6-5927f375ce79'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], external_ids={}, name=4984e16e-8f1c-4426-bfc6-5927f375ce79, nb_cfg_timestamp=1763796272162, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.836 104023 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f03cf796a90>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.837 104023 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.842 104023 DEBUG oslo_service.service [-] Started child 104131 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.845 104023 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp1gz1m0r1/privsep.sock']#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.845 104131 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-889473'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.869 104131 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.870 104131 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.870 104131 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.874 104131 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.881 104131 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 22 02:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:36.887 104131 INFO eventlet.wsgi.server [-] (104131) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 22 02:25:37 np0005531888 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.499 104023 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.500 104023 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1gz1m0r1/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.379 104136 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.384 104136 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.386 104136 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.386 104136 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104136
Nov 22 02:25:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:37.502 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[80a3213f-5261-4fe0-b39d-6625bd6e9bd6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.000 104136 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.000 104136 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.000 104136 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.536 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[9e790f34-0927-4465-8f9c-749d5a97fb8c]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.539 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, column=external_ids, values=({'neutron:ovn-metadata-id': '8c070ab8-39dd-5004-a80e-aeca22afdcf8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.810 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.920 104023 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.920 104023 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.920 104023 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.921 104023 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.921 104023 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.921 104023 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.921 104023 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.921 104023 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.921 104023 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.922 104023 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.922 104023 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.922 104023 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.922 104023 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.922 104023 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.922 104023 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.923 104023 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.923 104023 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.923 104023 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.923 104023 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.923 104023 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.923 104023 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.924 104023 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.924 104023 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.924 104023 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.924 104023 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.924 104023 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.925 104023 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.926 104023 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.926 104023 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.926 104023 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.926 104023 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.926 104023 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.927 104023 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.928 104023 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.929 104023 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.930 104023 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.931 104023 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.932 104023 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.933 104023 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.934 104023 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.935 104023 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.936 104023 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.936 104023 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.936 104023 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.936 104023 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.936 104023 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.936 104023 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.937 104023 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.938 104023 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.938 104023 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.938 104023 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.938 104023 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.938 104023 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.938 104023 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.939 104023 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.940 104023 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.941 104023 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.942 104023 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.943 104023 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.944 104023 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.945 104023 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.946 104023 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.947 104023 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.948 104023 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.949 104023 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.950 104023 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.951 104023 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.952 104023 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.953 104023 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.954 104023 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.955 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.956 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.957 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.958 104023 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.959 104023 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.959 104023 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.959 104023 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:25:38.959 104023 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:25:41 np0005531888 systemd-logind[825]: New session 23 of user zuul.
Nov 22 02:25:41 np0005531888 systemd[1]: Started Session 23 of User zuul.
Nov 22 02:25:42 np0005531888 python3.9[104294]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:25:44 np0005531888 python3.9[104450]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:25:45 np0005531888 python3.9[104615]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:25:45 np0005531888 systemd[1]: Reloading.
Nov 22 02:25:45 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:45 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:46 np0005531888 python3.9[104799]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:25:46 np0005531888 network[104816]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:25:46 np0005531888 network[104817]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:25:46 np0005531888 network[104818]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:25:50 np0005531888 python3.9[105079]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:51 np0005531888 python3.9[105232]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:52 np0005531888 python3.9[105385]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:53 np0005531888 python3.9[105538]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:53 np0005531888 python3.9[105691]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:54 np0005531888 python3.9[105844]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:55 np0005531888 python3.9[105997]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:56 np0005531888 python3.9[106150]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:57 np0005531888 podman[106303]: 2025-11-22 07:25:57.750767171 +0000 UTC m=+0.122240819 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:25:57 np0005531888 python3.9[106302]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:58 np0005531888 python3.9[106481]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:59 np0005531888 python3.9[106633]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:00 np0005531888 python3.9[106786]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:01 np0005531888 python3.9[106938]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:02 np0005531888 python3.9[107090]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:04 np0005531888 python3.9[107242]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:04 np0005531888 python3.9[107394]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:05 np0005531888 python3.9[107546]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:05 np0005531888 podman[107623]: 2025-11-22 07:26:05.673359936 +0000 UTC m=+0.046818629 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:26:05 np0005531888 python3.9[107717]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:06 np0005531888 python3.9[107869]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:07 np0005531888 python3.9[108021]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:07 np0005531888 python3.9[108173]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:08 np0005531888 python3.9[108325]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:09 np0005531888 python3.9[108477]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:26:10 np0005531888 python3.9[108629]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:26:10 np0005531888 systemd[1]: Reloading.
Nov 22 02:26:10 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:26:10 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:26:11 np0005531888 python3.9[108816]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:12 np0005531888 python3.9[108969]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:12 np0005531888 python3.9[109122]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:13 np0005531888 python3.9[109275]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:13 np0005531888 python3.9[109428]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:14 np0005531888 python3.9[109581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:14 np0005531888 python3.9[109734]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:16 np0005531888 python3.9[109887]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 22 02:26:17 np0005531888 python3.9[110040]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:26:18 np0005531888 python3.9[110198]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:26:19 np0005531888 python3.9[110358]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:26:20 np0005531888 python3.9[110442]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:26:28 np0005531888 podman[110505]: 2025-11-22 07:26:28.723463998 +0000 UTC m=+0.092006592 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:26:36 np0005531888 podman[110657]: 2025-11-22 07:26:36.684503695 +0000 UTC m=+0.057957054 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 02:26:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:26:36.778 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:26:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:26:36.779 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:26:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:26:36.779 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:26:47 np0005531888 kernel: SELinux:  Converting 2759 SID table entries...
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:26:47 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:26:57 np0005531888 kernel: SELinux:  Converting 2759 SID table entries...
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:26:57 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:26:59 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 22 02:26:59 np0005531888 podman[110693]: 2025-11-22 07:26:59.7636243 +0000 UTC m=+0.130441962 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:27:07 np0005531888 podman[110721]: 2025-11-22 07:27:07.673612154 +0000 UTC m=+0.052149990 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:27:30 np0005531888 podman[123704]: 2025-11-22 07:27:30.737488271 +0000 UTC m=+0.114003749 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 02:27:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:27:36.779 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:27:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:27:36.780 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:27:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:27:36.780 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:27:38 np0005531888 podman[127547]: 2025-11-22 07:27:38.667432229 +0000 UTC m=+0.048056809 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:27:51 np0005531888 kernel: SELinux:  Converting 2760 SID table entries...
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:27:51 np0005531888 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:27:53 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:27:53 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 22 02:27:53 np0005531888 dbus-broker-launch[814]: Noticed file-system modification, trigger reload.
Nov 22 02:28:01 np0005531888 systemd[1]: Stopping OpenSSH server daemon...
Nov 22 02:28:01 np0005531888 systemd[1]: sshd.service: Deactivated successfully.
Nov 22 02:28:01 np0005531888 systemd[1]: Stopped OpenSSH server daemon.
Nov 22 02:28:01 np0005531888 systemd[1]: sshd.service: Consumed 1.468s CPU time, read 32.0K from disk, written 0B to disk.
Nov 22 02:28:01 np0005531888 systemd[1]: Stopped target sshd-keygen.target.
Nov 22 02:28:01 np0005531888 systemd[1]: Stopping sshd-keygen.target...
Nov 22 02:28:01 np0005531888 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 02:28:01 np0005531888 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 02:28:01 np0005531888 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 02:28:01 np0005531888 systemd[1]: Reached target sshd-keygen.target.
Nov 22 02:28:01 np0005531888 systemd[1]: Starting OpenSSH server daemon...
Nov 22 02:28:01 np0005531888 systemd[1]: Started OpenSSH server daemon.
Nov 22 02:28:01 np0005531888 podman[128337]: 2025-11-22 07:28:01.313762195 +0000 UTC m=+0.121740468 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:28:04 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:28:04 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:28:04 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:04 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:04 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:04 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:28:09 np0005531888 podman[134571]: 2025-11-22 07:28:09.718420833 +0000 UTC m=+0.087350104 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 02:28:12 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:28:12 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:28:12 np0005531888 systemd[1]: man-db-cache-update.service: Consumed 9.702s CPU time.
Nov 22 02:28:12 np0005531888 systemd[1]: run-r5cd2d82a53434f9db1f33068972529a9.service: Deactivated successfully.
Nov 22 02:28:18 np0005531888 python3.9[137172]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:18 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:18 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:18 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:19 np0005531888 python3.9[137362]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:19 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:19 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:19 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:20 np0005531888 python3.9[137552]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:20 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:20 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:20 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:21 np0005531888 python3.9[137742]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:21 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:21 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:21 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:23 np0005531888 python3.9[137932]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:24 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:24 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:24 np0005531888 python3.9[138122]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:25 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:25 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:25 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:26 np0005531888 python3.9[138312]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:26 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:26 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:26 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:27 np0005531888 python3.9[138502]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:28 np0005531888 python3.9[138657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:28 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:28 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:28 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:29 np0005531888 python3.9[138848]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:29 np0005531888 systemd[1]: Reloading.
Nov 22 02:28:29 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:29 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:29 np0005531888 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 22 02:28:29 np0005531888 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 22 02:28:30 np0005531888 python3.9[139041]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:31 np0005531888 podman[139045]: 2025-11-22 07:28:31.720818925 +0000 UTC m=+0.096652539 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:28:32 np0005531888 python3.9[139223]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:33 np0005531888 python3.9[139378]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:34 np0005531888 python3.9[139533]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:35 np0005531888 python3.9[139688]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:36 np0005531888 python3.9[139843]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:28:36.781 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:28:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:28:36.782 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:28:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:28:36.782 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:28:37 np0005531888 python3.9[139998]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:38 np0005531888 python3.9[140153]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:38 np0005531888 python3.9[140308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:39 np0005531888 python3.9[140463]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:39 np0005531888 podman[140465]: 2025-11-22 07:28:39.947359935 +0000 UTC m=+0.045903155 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:28:40 np0005531888 python3.9[140637]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:41 np0005531888 python3.9[140792]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:42 np0005531888 python3.9[140947]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:43 np0005531888 python3.9[141102]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:45 np0005531888 python3.9[141257]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:46 np0005531888 python3.9[141409]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:46 np0005531888 python3.9[141561]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:47 np0005531888 python3.9[141713]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:48 np0005531888 python3.9[141865]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:49 np0005531888 python3.9[142017]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:51 np0005531888 python3.9[142169]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:52 np0005531888 python3.9[142294]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796530.301521-1631-182584958691743/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:52 np0005531888 python3.9[142446]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:53 np0005531888 python3.9[142571]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796532.4617546-1631-98663830752129/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:54 np0005531888 python3.9[142723]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:55 np0005531888 python3.9[142848]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796534.008304-1631-130616533722067/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:56 np0005531888 python3.9[143000]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:57 np0005531888 python3.9[143125]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796535.6636314-1631-100831397881107/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:57 np0005531888 python3.9[143277]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:58 np0005531888 python3.9[143402]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796537.326738-1631-102174990644453/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:59 np0005531888 python3.9[143554]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:59 np0005531888 python3.9[143679]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796538.7266097-1631-33524749198041/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:00 np0005531888 python3.9[143831]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:01 np0005531888 python3.9[143954]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796539.9646337-1631-214768167813543/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:01 np0005531888 podman[144106]: 2025-11-22 07:29:01.870675313 +0000 UTC m=+0.091343269 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 02:29:01 np0005531888 python3.9[144107]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:02 np0005531888 python3.9[144259]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796541.4491944-1631-53720916687509/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:03 np0005531888 python3.9[144411]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 22 02:29:04 np0005531888 python3.9[144564]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:05 np0005531888 python3.9[144716]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:06 np0005531888 python3.9[144868]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:06 np0005531888 python3.9[145020]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:07 np0005531888 python3.9[145172]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:08 np0005531888 python3.9[145324]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:08 np0005531888 python3.9[145476]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:09 np0005531888 python3.9[145628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:10 np0005531888 podman[145752]: 2025-11-22 07:29:10.060907715 +0000 UTC m=+0.043400434 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:29:10 np0005531888 python3.9[145799]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:10 np0005531888 python3.9[145951]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:11 np0005531888 python3.9[146103]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:12 np0005531888 python3.9[146255]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:13 np0005531888 python3.9[146407]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:13 np0005531888 python3.9[146559]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:15 np0005531888 python3.9[146711]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:15 np0005531888 python3.9[146834]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796554.6048303-2293-5921040370470/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:16 np0005531888 python3.9[146986]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:17 np0005531888 python3.9[147109]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796555.957304-2293-235884079287960/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:17 np0005531888 python3.9[147261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:18 np0005531888 python3.9[147384]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796557.4689853-2293-236522076453752/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:19 np0005531888 python3.9[147536]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:19 np0005531888 python3.9[147659]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796558.7388778-2293-259545283665253/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:20 np0005531888 python3.9[147811]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:21 np0005531888 python3.9[147934]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796559.9718359-2293-147435465727690/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:21 np0005531888 python3.9[148086]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:22 np0005531888 python3.9[148209]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796561.3008132-2293-86602489901372/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:23 np0005531888 python3.9[148361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:23 np0005531888 python3.9[148484]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796562.6659122-2293-268624814537186/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:24 np0005531888 python3.9[148636]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:25 np0005531888 python3.9[148759]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796564.0008063-2293-199736564328689/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:25 np0005531888 python3.9[148911]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:26 np0005531888 python3.9[149034]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796565.2691238-2293-104198616690/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:27 np0005531888 python3.9[149186]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:27 np0005531888 python3.9[149309]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796566.5104492-2293-212627992544550/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:28 np0005531888 python3.9[149461]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:28 np0005531888 python3.9[149584]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796567.8118007-2293-276286447588937/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:29 np0005531888 python3.9[149736]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:30 np0005531888 python3.9[149859]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796569.0604548-2293-97770520891940/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:30 np0005531888 python3.9[150011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:31 np0005531888 python3.9[150134]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796570.3410332-2293-58227925362479/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:32 np0005531888 podman[150286]: 2025-11-22 07:29:32.026634931 +0000 UTC m=+0.084416653 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 02:29:32 np0005531888 python3.9[150287]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:34 np0005531888 python3.9[150435]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796571.6346474-2293-161253240592856/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:35 np0005531888 python3.9[150585]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:29:36 np0005531888 python3.9[150740]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 22 02:29:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:29:36.782 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:29:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:29:36.783 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:29:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:29:36.783 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:29:38 np0005531888 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 22 02:29:38 np0005531888 python3.9[150896]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:39 np0005531888 python3.9[151048]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:40 np0005531888 python3.9[151200]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:40 np0005531888 podman[151324]: 2025-11-22 07:29:40.54768835 +0000 UTC m=+0.064664314 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 02:29:40 np0005531888 python3.9[151372]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:41 np0005531888 python3.9[151524]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:42 np0005531888 python3.9[151676]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:43 np0005531888 python3.9[151828]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:43 np0005531888 python3.9[151980]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:44 np0005531888 python3.9[152132]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:45 np0005531888 python3.9[152284]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:45 np0005531888 python3.9[152436]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:45 np0005531888 systemd[1]: Reloading.
Nov 22 02:29:45 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:45 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:46 np0005531888 systemd[1]: Starting libvirt logging daemon socket...
Nov 22 02:29:46 np0005531888 systemd[1]: Listening on libvirt logging daemon socket.
Nov 22 02:29:46 np0005531888 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 22 02:29:46 np0005531888 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 22 02:29:46 np0005531888 systemd[1]: Starting libvirt logging daemon...
Nov 22 02:29:46 np0005531888 systemd[1]: Started libvirt logging daemon.
Nov 22 02:29:47 np0005531888 python3.9[152630]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:47 np0005531888 systemd[1]: Reloading.
Nov 22 02:29:47 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:47 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:47 np0005531888 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 22 02:29:47 np0005531888 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 22 02:29:47 np0005531888 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 22 02:29:47 np0005531888 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 22 02:29:47 np0005531888 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 22 02:29:47 np0005531888 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 22 02:29:47 np0005531888 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 02:29:47 np0005531888 systemd[1]: Started libvirt nodedev daemon.
Nov 22 02:29:48 np0005531888 python3.9[152846]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:48 np0005531888 systemd[1]: Reloading.
Nov 22 02:29:48 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:48 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:48 np0005531888 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 22 02:29:48 np0005531888 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 22 02:29:48 np0005531888 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 22 02:29:48 np0005531888 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 22 02:29:48 np0005531888 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 22 02:29:48 np0005531888 systemd[1]: Starting libvirt proxy daemon...
Nov 22 02:29:48 np0005531888 systemd[1]: Started libvirt proxy daemon.
Nov 22 02:29:48 np0005531888 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 22 02:29:49 np0005531888 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 22 02:29:49 np0005531888 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 22 02:29:49 np0005531888 python3.9[153057]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:49 np0005531888 systemd[1]: Reloading.
Nov 22 02:29:49 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:49 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:49 np0005531888 systemd[1]: Listening on libvirt locking daemon socket.
Nov 22 02:29:49 np0005531888 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 22 02:29:49 np0005531888 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 22 02:29:49 np0005531888 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 22 02:29:49 np0005531888 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 22 02:29:49 np0005531888 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 22 02:29:49 np0005531888 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 22 02:29:49 np0005531888 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 22 02:29:49 np0005531888 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 22 02:29:49 np0005531888 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 22 02:29:49 np0005531888 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 02:29:49 np0005531888 systemd[1]: Started libvirt QEMU daemon.
Nov 22 02:29:50 np0005531888 setroubleshoot[152882]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l ef2c3f2c-3c47-49df-900a-cf7c475c561f
Nov 22 02:29:50 np0005531888 setroubleshoot[152882]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 22 02:29:50 np0005531888 setroubleshoot[152882]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l ef2c3f2c-3c47-49df-900a-cf7c475c561f
Nov 22 02:29:50 np0005531888 setroubleshoot[152882]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 22 02:29:50 np0005531888 python3.9[153282]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:50 np0005531888 systemd[1]: Reloading.
Nov 22 02:29:50 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:50 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:50 np0005531888 systemd[1]: Starting libvirt secret daemon socket...
Nov 22 02:29:51 np0005531888 systemd[1]: Listening on libvirt secret daemon socket.
Nov 22 02:29:51 np0005531888 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 22 02:29:51 np0005531888 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 22 02:29:51 np0005531888 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 22 02:29:51 np0005531888 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 22 02:29:51 np0005531888 systemd[1]: Starting libvirt secret daemon...
Nov 22 02:29:51 np0005531888 systemd[1]: Started libvirt secret daemon.
Nov 22 02:29:53 np0005531888 python3.9[153493]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:54 np0005531888 python3.9[153645]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:29:55 np0005531888 python3.9[153797]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:56 np0005531888 python3.9[153921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796594.7481227-3328-117462668570672/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:57 np0005531888 python3.9[154073]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:58 np0005531888 python3.9[154225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:58 np0005531888 python3.9[154303]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:59 np0005531888 python3.9[154455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:59 np0005531888 python3.9[154533]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6hxnf6dd recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:00 np0005531888 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 22 02:30:00 np0005531888 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.006s CPU time.
Nov 22 02:30:00 np0005531888 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 22 02:30:00 np0005531888 python3.9[154685]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:01 np0005531888 python3.9[154763]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:02 np0005531888 python3.9[154915]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:02 np0005531888 podman[154993]: 2025-11-22 07:30:02.727860347 +0000 UTC m=+0.089851582 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 02:30:03 np0005531888 python3[155095]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:30:04 np0005531888 python3.9[155247]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:05 np0005531888 python3.9[155325]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:06 np0005531888 python3.9[155477]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:06 np0005531888 python3.9[155555]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:07 np0005531888 python3.9[155707]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:08 np0005531888 python3.9[155785]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:08 np0005531888 python3.9[155937]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:09 np0005531888 python3.9[156015]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:10 np0005531888 python3.9[156167]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:10 np0005531888 podman[156293]: 2025-11-22 07:30:10.682720514 +0000 UTC m=+0.050770355 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:30:10 np0005531888 python3.9[156292]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796609.547956-3703-216319497729006/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:11 np0005531888 python3.9[156465]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:12 np0005531888 python3.9[156617]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:13 np0005531888 python3.9[156772]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:14 np0005531888 python3.9[156924]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:14 np0005531888 python3.9[157077]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:30:15 np0005531888 python3.9[157231]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:16 np0005531888 python3.9[157386]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:17 np0005531888 python3.9[157538]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:17 np0005531888 python3.9[157661]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796616.8109093-3919-254484440301635/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:18 np0005531888 python3.9[157813]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:19 np0005531888 python3.9[157936]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796618.338517-3965-82699770749496/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:20 np0005531888 python3.9[158088]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:20 np0005531888 python3.9[158211]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796619.6543252-4008-144558378410188/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:21 np0005531888 python3.9[158363]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:30:21 np0005531888 systemd[1]: Reloading.
Nov 22 02:30:21 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:21 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:21 np0005531888 systemd[1]: Reached target edpm_libvirt.target.
Nov 22 02:30:22 np0005531888 python3.9[158555]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 02:30:22 np0005531888 systemd[1]: Reloading.
Nov 22 02:30:22 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:22 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:30:23 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:23 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:24 np0005531888 systemd[1]: session-23.scope: Deactivated successfully.
Nov 22 02:30:24 np0005531888 systemd[1]: session-23.scope: Consumed 3min 14.363s CPU time.
Nov 22 02:30:24 np0005531888 systemd-logind[825]: Session 23 logged out. Waiting for processes to exit.
Nov 22 02:30:24 np0005531888 systemd-logind[825]: Removed session 23.
Nov 22 02:30:29 np0005531888 systemd-logind[825]: New session 24 of user zuul.
Nov 22 02:30:29 np0005531888 systemd[1]: Started Session 24 of User zuul.
Nov 22 02:30:30 np0005531888 python3.9[158806]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:30:31 np0005531888 python3.9[158960]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:30:31 np0005531888 network[158977]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:30:31 np0005531888 network[158978]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:30:31 np0005531888 network[158979]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:30:32 np0005531888 podman[158987]: 2025-11-22 07:30:32.874084516 +0000 UTC m=+0.091206248 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 02:30:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:30:36.783 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:30:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:30:36.784 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:30:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:30:36.784 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:30:36 np0005531888 python3.9[159276]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:30:37 np0005531888 python3.9[159360]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:30:41 np0005531888 podman[159362]: 2025-11-22 07:30:41.690642317 +0000 UTC m=+0.056824592 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:30:44 np0005531888 python3.9[159535]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:30:45 np0005531888 python3.9[159687]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:45 np0005531888 python3.9[159840]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:30:46 np0005531888 python3.9[159992]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:47 np0005531888 python3.9[160145]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:48 np0005531888 python3.9[160268]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796647.0292249-253-70013715536862/.source.iscsi _original_basename=.i61es976 follow=False checksum=ce71b536ade639c6949c77bcf7fa21efb9f4cdbd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:49 np0005531888 python3.9[160420]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:49 np0005531888 python3.9[160572]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:49 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:30:49 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:30:49 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:30:51 np0005531888 python3.9[160725]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:30:51 np0005531888 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 22 02:30:52 np0005531888 python3.9[160881]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:30:52 np0005531888 systemd[1]: Reloading.
Nov 22 02:30:52 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:52 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:52 np0005531888 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 02:30:52 np0005531888 systemd[1]: Starting Open-iSCSI...
Nov 22 02:30:52 np0005531888 kernel: Loading iSCSI transport class v2.0-870.
Nov 22 02:30:52 np0005531888 systemd[1]: Started Open-iSCSI.
Nov 22 02:30:52 np0005531888 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 22 02:30:52 np0005531888 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 22 02:30:53 np0005531888 python3.9[161082]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:30:53 np0005531888 network[161099]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:30:53 np0005531888 network[161100]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:30:53 np0005531888 network[161101]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:30:58 np0005531888 python3.9[161372]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 02:30:59 np0005531888 python3.9[161524]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 22 02:30:59 np0005531888 python3.9[161680]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:00 np0005531888 python3.9[161803]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796659.4978948-484-68093506936547/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:01 np0005531888 python3.9[161957]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:02 np0005531888 python3.9[162109]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:31:02 np0005531888 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 02:31:02 np0005531888 systemd[1]: Stopped Load Kernel Modules.
Nov 22 02:31:02 np0005531888 systemd[1]: Stopping Load Kernel Modules...
Nov 22 02:31:02 np0005531888 systemd[1]: Starting Load Kernel Modules...
Nov 22 02:31:02 np0005531888 systemd[1]: Finished Load Kernel Modules.
Nov 22 02:31:03 np0005531888 podman[162265]: 2025-11-22 07:31:03.051571319 +0000 UTC m=+0.097992820 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 22 02:31:03 np0005531888 python3.9[162266]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:04 np0005531888 python3.9[162443]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:04 np0005531888 python3.9[162595]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:05 np0005531888 python3.9[162747]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:05 np0005531888 python3.9[162870]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796664.979886-658-127466862603552/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:06 np0005531888 python3.9[163022]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:31:07 np0005531888 python3.9[163175]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:08 np0005531888 python3.9[163327]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:08 np0005531888 python3.9[163479]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:09 np0005531888 python3.9[163631]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:10 np0005531888 python3.9[163783]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:10 np0005531888 python3.9[163935]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:11 np0005531888 python3.9[164087]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:12 np0005531888 podman[164211]: 2025-11-22 07:31:12.006257451 +0000 UTC m=+0.056129260 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 02:31:12 np0005531888 python3.9[164258]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:12 np0005531888 python3.9[164414]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:13 np0005531888 python3.9[164566]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:14 np0005531888 python3.9[164718]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:14 np0005531888 python3.9[164796]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:15 np0005531888 python3.9[164948]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:16 np0005531888 python3.9[165026]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:16 np0005531888 python3.9[165178]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:17 np0005531888 python3.9[165330]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:17 np0005531888 python3.9[165408]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:18 np0005531888 python3.9[165560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:19 np0005531888 python3.9[165638]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:20 np0005531888 python3.9[165790]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:31:20 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:20 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:20 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:21 np0005531888 python3.9[165979]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:21 np0005531888 python3.9[166057]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:22 np0005531888 python3.9[166209]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:22 np0005531888 python3.9[166287]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:23 np0005531888 python3.9[166439]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:31:23 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:23 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:23 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:24 np0005531888 systemd[1]: Starting Create netns directory...
Nov 22 02:31:24 np0005531888 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:31:24 np0005531888 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:31:24 np0005531888 systemd[1]: Finished Create netns directory.
Nov 22 02:31:25 np0005531888 python3.9[166632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:25 np0005531888 python3.9[166784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:26 np0005531888 python3.9[166907]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796685.299417-1278-219510814348012/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:27 np0005531888 python3.9[167059]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:28 np0005531888 python3.9[167211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:28 np0005531888 python3.9[167334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796687.6350596-1353-14106405246461/.source.json _original_basename=.e26oib2k follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:29 np0005531888 python3.9[167486]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:31 np0005531888 python3.9[167913]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 22 02:31:33 np0005531888 podman[168037]: 2025-11-22 07:31:33.201323082 +0000 UTC m=+0.117168827 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:31:33 np0005531888 python3.9[168084]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:31:34 np0005531888 python3.9[168245]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 02:31:36 np0005531888 python3[168424]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:31:36 np0005531888 podman[168459]: 2025-11-22 07:31:36.360166667 +0000 UTC m=+0.051181293 container create 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 02:31:36 np0005531888 podman[168459]: 2025-11-22 07:31:36.329029031 +0000 UTC m=+0.020043677 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 02:31:36 np0005531888 python3[168424]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 02:31:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:31:36.784 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:31:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:31:36.784 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:31:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:31:36.785 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:31:37 np0005531888 python3.9[168646]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:38 np0005531888 python3.9[168800]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:38 np0005531888 python3.9[168876]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:39 np0005531888 python3.9[169027]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796698.6771781-1617-194251940807134/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:39 np0005531888 python3.9[169103]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:31:39 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:40 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:40 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:40 np0005531888 python3.9[169214]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:31:40 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:41 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:41 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:41 np0005531888 systemd[1]: Starting multipathd container...
Nov 22 02:31:41 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:31:41 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/132f10213d893961e56ea28a5c34637141647bed717c19b110a3756ab7953b32/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:41 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/132f10213d893961e56ea28a5c34637141647bed717c19b110a3756ab7953b32/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:41 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.
Nov 22 02:31:41 np0005531888 podman[169254]: 2025-11-22 07:31:41.60833616 +0000 UTC m=+0.303300880 container init 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 22 02:31:41 np0005531888 multipathd[169269]: + sudo -E kolla_set_configs
Nov 22 02:31:41 np0005531888 podman[169254]: 2025-11-22 07:31:41.641492155 +0000 UTC m=+0.336456855 container start 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 02:31:41 np0005531888 podman[169254]: multipathd
Nov 22 02:31:41 np0005531888 systemd[1]: Started multipathd container.
Nov 22 02:31:41 np0005531888 multipathd[169269]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:31:41 np0005531888 multipathd[169269]: INFO:__main__:Validating config file
Nov 22 02:31:41 np0005531888 multipathd[169269]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:31:41 np0005531888 multipathd[169269]: INFO:__main__:Writing out command to execute
Nov 22 02:31:41 np0005531888 multipathd[169269]: ++ cat /run_command
Nov 22 02:31:41 np0005531888 multipathd[169269]: + CMD='/usr/sbin/multipathd -d'
Nov 22 02:31:41 np0005531888 multipathd[169269]: + ARGS=
Nov 22 02:31:41 np0005531888 multipathd[169269]: + sudo kolla_copy_cacerts
Nov 22 02:31:41 np0005531888 multipathd[169269]: + [[ ! -n '' ]]
Nov 22 02:31:41 np0005531888 multipathd[169269]: + . kolla_extend_start
Nov 22 02:31:41 np0005531888 multipathd[169269]: Running command: '/usr/sbin/multipathd -d'
Nov 22 02:31:41 np0005531888 multipathd[169269]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 02:31:41 np0005531888 multipathd[169269]: + umask 0022
Nov 22 02:31:41 np0005531888 multipathd[169269]: + exec /usr/sbin/multipathd -d
Nov 22 02:31:41 np0005531888 multipathd[169269]: 3442.437200 | --------start up--------
Nov 22 02:31:41 np0005531888 multipathd[169269]: 3442.437223 | read /etc/multipath.conf
Nov 22 02:31:41 np0005531888 podman[169276]: 2025-11-22 07:31:41.743581457 +0000 UTC m=+0.090792025 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 02:31:41 np0005531888 multipathd[169269]: 3442.443645 | path checkers start up
Nov 22 02:31:41 np0005531888 systemd[1]: 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b-2c3c2cb3c567e80f.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:31:41 np0005531888 systemd[1]: 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b-2c3c2cb3c567e80f.service: Failed with result 'exit-code'.
Nov 22 02:31:42 np0005531888 podman[169432]: 2025-11-22 07:31:42.255468643 +0000 UTC m=+0.053380341 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 22 02:31:42 np0005531888 python3.9[169475]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:43 np0005531888 python3.9[169634]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:31:44 np0005531888 python3.9[169799]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:31:44 np0005531888 systemd[1]: Stopping multipathd container...
Nov 22 02:31:44 np0005531888 multipathd[169269]: 3445.198909 | exit (signal)
Nov 22 02:31:44 np0005531888 multipathd[169269]: 3445.199033 | --------shut down-------
Nov 22 02:31:44 np0005531888 systemd[1]: libpod-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.scope: Deactivated successfully.
Nov 22 02:31:44 np0005531888 conmon[169269]: conmon 700e348b810190593def <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.scope/container/memory.events
Nov 22 02:31:44 np0005531888 podman[169803]: 2025-11-22 07:31:44.531824736 +0000 UTC m=+0.279397782 container died 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 22 02:31:44 np0005531888 systemd[1]: 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b-2c3c2cb3c567e80f.timer: Deactivated successfully.
Nov 22 02:31:44 np0005531888 systemd[1]: Stopped /usr/bin/podman healthcheck run 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.
Nov 22 02:31:44 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b-userdata-shm.mount: Deactivated successfully.
Nov 22 02:31:44 np0005531888 systemd[1]: var-lib-containers-storage-overlay-132f10213d893961e56ea28a5c34637141647bed717c19b110a3756ab7953b32-merged.mount: Deactivated successfully.
Nov 22 02:31:44 np0005531888 podman[169803]: 2025-11-22 07:31:44.861406925 +0000 UTC m=+0.608979971 container cleanup 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:31:44 np0005531888 podman[169803]: multipathd
Nov 22 02:31:44 np0005531888 podman[169832]: multipathd
Nov 22 02:31:44 np0005531888 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 22 02:31:44 np0005531888 systemd[1]: Stopped multipathd container.
Nov 22 02:31:44 np0005531888 systemd[1]: Starting multipathd container...
Nov 22 02:31:45 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:31:45 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/132f10213d893961e56ea28a5c34637141647bed717c19b110a3756ab7953b32/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:45 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/132f10213d893961e56ea28a5c34637141647bed717c19b110a3756ab7953b32/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:45 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.
Nov 22 02:31:45 np0005531888 podman[169845]: 2025-11-22 07:31:45.091286843 +0000 UTC m=+0.126463664 container init 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:31:45 np0005531888 multipathd[169860]: + sudo -E kolla_set_configs
Nov 22 02:31:45 np0005531888 podman[169845]: 2025-11-22 07:31:45.122964211 +0000 UTC m=+0.158141022 container start 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:31:45 np0005531888 podman[169845]: multipathd
Nov 22 02:31:45 np0005531888 systemd[1]: Started multipathd container.
Nov 22 02:31:45 np0005531888 multipathd[169860]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:31:45 np0005531888 multipathd[169860]: INFO:__main__:Validating config file
Nov 22 02:31:45 np0005531888 multipathd[169860]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:31:45 np0005531888 multipathd[169860]: INFO:__main__:Writing out command to execute
Nov 22 02:31:45 np0005531888 multipathd[169860]: ++ cat /run_command
Nov 22 02:31:45 np0005531888 multipathd[169860]: + CMD='/usr/sbin/multipathd -d'
Nov 22 02:31:45 np0005531888 multipathd[169860]: + ARGS=
Nov 22 02:31:45 np0005531888 multipathd[169860]: + sudo kolla_copy_cacerts
Nov 22 02:31:45 np0005531888 podman[169867]: 2025-11-22 07:31:45.188482841 +0000 UTC m=+0.055121717 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:31:45 np0005531888 systemd[1]: 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b-693128619ac861d8.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:31:45 np0005531888 systemd[1]: 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b-693128619ac861d8.service: Failed with result 'exit-code'.
Nov 22 02:31:45 np0005531888 multipathd[169860]: + [[ ! -n '' ]]
Nov 22 02:31:45 np0005531888 multipathd[169860]: + . kolla_extend_start
Nov 22 02:31:45 np0005531888 multipathd[169860]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 02:31:45 np0005531888 multipathd[169860]: Running command: '/usr/sbin/multipathd -d'
Nov 22 02:31:45 np0005531888 multipathd[169860]: + umask 0022
Nov 22 02:31:45 np0005531888 multipathd[169860]: + exec /usr/sbin/multipathd -d
Nov 22 02:31:45 np0005531888 multipathd[169860]: 3445.915939 | --------start up--------
Nov 22 02:31:45 np0005531888 multipathd[169860]: 3445.915967 | read /etc/multipath.conf
Nov 22 02:31:45 np0005531888 multipathd[169860]: 3445.921445 | path checkers start up
Nov 22 02:31:45 np0005531888 python3.9[170050]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:47 np0005531888 python3.9[170202]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 02:31:47 np0005531888 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 22 02:31:47 np0005531888 python3.9[170355]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 22 02:31:47 np0005531888 kernel: Key type psk registered
Nov 22 02:31:48 np0005531888 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 02:31:48 np0005531888 python3.9[170518]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:49 np0005531888 python3.9[170641]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796708.224814-1858-42548461440253/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:49 np0005531888 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 22 02:31:50 np0005531888 python3.9[170794]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:51 np0005531888 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 22 02:31:51 np0005531888 python3.9[170946]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:31:51 np0005531888 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 02:31:51 np0005531888 systemd[1]: Stopped Load Kernel Modules.
Nov 22 02:31:51 np0005531888 systemd[1]: Stopping Load Kernel Modules...
Nov 22 02:31:51 np0005531888 systemd[1]: Starting Load Kernel Modules...
Nov 22 02:31:51 np0005531888 systemd[1]: Finished Load Kernel Modules.
Nov 22 02:31:52 np0005531888 python3.9[171103]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:31:56 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:56 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:56 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:56 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:56 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:56 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:57 np0005531888 systemd-logind[825]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 02:31:57 np0005531888 systemd-logind[825]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 02:31:57 np0005531888 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:31:57 np0005531888 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:31:57 np0005531888 systemd[1]: Reloading.
Nov 22 02:31:57 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:57 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:57 np0005531888 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:32:00 np0005531888 python3.9[172557]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:32:00 np0005531888 iscsid[160921]: iscsid shutting down.
Nov 22 02:32:00 np0005531888 systemd[1]: Stopping Open-iSCSI...
Nov 22 02:32:00 np0005531888 systemd[1]: iscsid.service: Deactivated successfully.
Nov 22 02:32:00 np0005531888 systemd[1]: Stopped Open-iSCSI.
Nov 22 02:32:00 np0005531888 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 02:32:00 np0005531888 systemd[1]: Starting Open-iSCSI...
Nov 22 02:32:00 np0005531888 systemd[1]: Started Open-iSCSI.
Nov 22 02:32:00 np0005531888 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:32:00 np0005531888 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:32:00 np0005531888 systemd[1]: man-db-cache-update.service: Consumed 1.617s CPU time.
Nov 22 02:32:00 np0005531888 systemd[1]: run-r63ea509827154609b04e3677f8db3f9e.service: Deactivated successfully.
Nov 22 02:32:01 np0005531888 python3.9[172712]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:32:02 np0005531888 python3.9[172868]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:03 np0005531888 podman[172992]: 2025-11-22 07:32:03.571626179 +0000 UTC m=+0.119803635 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:32:03 np0005531888 python3.9[173037]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:32:03 np0005531888 systemd[1]: Reloading.
Nov 22 02:32:03 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:32:03 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:32:05 np0005531888 python3.9[173231]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:32:05 np0005531888 network[173248]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:32:05 np0005531888 network[173249]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:32:05 np0005531888 network[173250]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:32:09 np0005531888 python3.9[173524]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:09 np0005531888 python3.9[173677]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:10 np0005531888 python3.9[173830]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:11 np0005531888 python3.9[173983]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:12 np0005531888 python3.9[174136]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:12 np0005531888 podman[174260]: 2025-11-22 07:32:12.691772973 +0000 UTC m=+0.059142668 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 02:32:13 np0005531888 python3.9[174306]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:13 np0005531888 python3.9[174460]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:14 np0005531888 python3.9[174613]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:15 np0005531888 podman[174727]: 2025-11-22 07:32:15.688491313 +0000 UTC m=+0.055869655 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:32:15 np0005531888 python3.9[174784]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:16 np0005531888 python3.9[174936]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:17 np0005531888 python3.9[175088]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:17 np0005531888 python3.9[175240]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:18 np0005531888 python3.9[175392]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:19 np0005531888 python3.9[175544]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:19 np0005531888 python3.9[175696]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:20 np0005531888 python3.9[175848]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:21 np0005531888 python3.9[176000]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:21 np0005531888 python3.9[176152]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:22 np0005531888 python3.9[176304]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:23 np0005531888 python3.9[176456]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:23 np0005531888 python3.9[176608]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:24 np0005531888 python3.9[176760]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:24 np0005531888 python3.9[176912]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:25 np0005531888 python3.9[177064]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:26 np0005531888 python3.9[177216]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:27 np0005531888 python3.9[177368]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:32:28 np0005531888 python3.9[177520]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:32:28 np0005531888 systemd[1]: Reloading.
Nov 22 02:32:28 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:32:28 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:32:29 np0005531888 python3.9[177707]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:30 np0005531888 python3.9[177860]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:31 np0005531888 python3.9[178013]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:31 np0005531888 python3.9[178166]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:32 np0005531888 python3.9[178319]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:33 np0005531888 python3.9[178472]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:33 np0005531888 python3.9[178625]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:33 np0005531888 podman[178626]: 2025-11-22 07:32:33.714745787 +0000 UTC m=+0.087158152 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 22 02:32:34 np0005531888 python3.9[178804]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:36 np0005531888 python3.9[178957]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:32:36.785 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:32:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:32:36.785 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:32:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:32:36.785 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:32:37 np0005531888 python3.9[179109]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:37 np0005531888 python3.9[179261]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:38 np0005531888 python3.9[179413]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:39 np0005531888 python3.9[179565]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:40 np0005531888 python3.9[179717]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:40 np0005531888 python3.9[179869]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:41 np0005531888 python3.9[180021]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:42 np0005531888 python3.9[180173]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:42 np0005531888 python3.9[180325]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:42 np0005531888 podman[180326]: 2025-11-22 07:32:42.965344445 +0000 UTC m=+0.056045128 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 02:32:46 np0005531888 podman[180370]: 2025-11-22 07:32:46.680639912 +0000 UTC m=+0.055854163 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:32:48 np0005531888 python3.9[180517]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 22 02:32:49 np0005531888 python3.9[180670]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:32:50 np0005531888 python3.9[180828]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:32:52 np0005531888 systemd-logind[825]: New session 25 of user zuul.
Nov 22 02:32:52 np0005531888 systemd[1]: Started Session 25 of User zuul.
Nov 22 02:32:52 np0005531888 systemd[1]: session-25.scope: Deactivated successfully.
Nov 22 02:32:52 np0005531888 systemd-logind[825]: Session 25 logged out. Waiting for processes to exit.
Nov 22 02:32:52 np0005531888 systemd-logind[825]: Removed session 25.
Nov 22 02:32:53 np0005531888 python3.9[181014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:53 np0005531888 python3.9[181135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796772.6963124-3421-276900791187594/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:54 np0005531888 python3.9[181285]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:54 np0005531888 python3.9[181361]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:55 np0005531888 python3.9[181511]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:55 np0005531888 python3.9[181632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796774.9623203-3421-3703725722748/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:56 np0005531888 python3.9[181782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:57 np0005531888 python3.9[181903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796776.049898-3421-26201636061560/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:57 np0005531888 python3.9[182053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:58 np0005531888 python3.9[182174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796777.265741-3421-74226819026318/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:58 np0005531888 python3.9[182324]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:59 np0005531888 python3.9[182445]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796778.4745781-3421-233437476166954/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:00 np0005531888 python3.9[182597]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:00 np0005531888 python3.9[182749]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:01 np0005531888 python3.9[182901]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:02 np0005531888 python3.9[183053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:03 np0005531888 python3.9[183176]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763796781.9606678-3743-199655430363463/.source _original_basename=.fmuj4aje follow=False checksum=1fc8294ddbea7f078c650660c5e0300c8c4aa523 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 22 02:33:03 np0005531888 python3.9[183328]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:04 np0005531888 podman[183454]: 2025-11-22 07:33:04.544817849 +0000 UTC m=+0.118608323 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:33:04 np0005531888 python3.9[183491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:05 np0005531888 python3.9[183625]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796784.1423438-3821-205486207096503/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:05 np0005531888 python3.9[183775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:06 np0005531888 python3.9[183896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796785.4895122-3865-255390918282670/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:07 np0005531888 python3.9[184048]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 22 02:33:08 np0005531888 python3.9[184200]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:33:09 np0005531888 python3[184352]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:33:09 np0005531888 podman[184389]: 2025-11-22 07:33:09.683603175 +0000 UTC m=+0.059380949 container create 7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:33:09 np0005531888 podman[184389]: 2025-11-22 07:33:09.64801185 +0000 UTC m=+0.023789644 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 02:33:09 np0005531888 python3[184352]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 22 02:33:10 np0005531888 python3.9[184579]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:11 np0005531888 python3.9[184733]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 22 02:33:12 np0005531888 python3.9[184885]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:33:13 np0005531888 podman[185009]: 2025-11-22 07:33:13.376669093 +0000 UTC m=+0.069555558 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:33:13 np0005531888 python3[185055]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:33:13 np0005531888 podman[185087]: 2025-11-22 07:33:13.877705981 +0000 UTC m=+0.059070572 container create bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 22 02:33:13 np0005531888 podman[185087]: 2025-11-22 07:33:13.843333328 +0000 UTC m=+0.024697939 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 02:33:13 np0005531888 python3[185055]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 22 02:33:14 np0005531888 python3.9[185277]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:15 np0005531888 python3.9[185431]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:16 np0005531888 python3.9[185582]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796795.6857784-4140-31758053086999/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:16 np0005531888 python3.9[185658]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:33:16 np0005531888 systemd[1]: Reloading.
Nov 22 02:33:16 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:16 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:16 np0005531888 podman[185660]: 2025-11-22 07:33:16.957345369 +0000 UTC m=+0.102918012 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 02:33:17 np0005531888 python3.9[185790]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:33:17 np0005531888 systemd[1]: Reloading.
Nov 22 02:33:17 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:17 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:18 np0005531888 systemd[1]: Starting nova_compute container...
Nov 22 02:33:18 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:33:18 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531888 podman[185830]: 2025-11-22 07:33:18.223297059 +0000 UTC m=+0.108315010 container init bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0)
Nov 22 02:33:18 np0005531888 podman[185830]: 2025-11-22 07:33:18.231168186 +0000 UTC m=+0.116186107 container start bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:33:18 np0005531888 podman[185830]: nova_compute
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + sudo -E kolla_set_configs
Nov 22 02:33:18 np0005531888 systemd[1]: Started nova_compute container.
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Validating config file
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying service configuration files
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Deleting /etc/ceph
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Creating directory /etc/ceph
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Writing out command to execute
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:18 np0005531888 nova_compute[185846]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:18 np0005531888 nova_compute[185846]: ++ cat /run_command
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + CMD=nova-compute
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + ARGS=
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + sudo kolla_copy_cacerts
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + [[ ! -n '' ]]
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + . kolla_extend_start
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 02:33:18 np0005531888 nova_compute[185846]: Running command: 'nova-compute'
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + umask 0022
Nov 22 02:33:18 np0005531888 nova_compute[185846]: + exec nova-compute
Nov 22 02:33:19 np0005531888 python3.9[186008]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.470 185850 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.470 185850 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.471 185850 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.471 185850 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 22 02:33:20 np0005531888 python3.9[186158]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.619 185850 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.643 185850 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:33:20 np0005531888 nova_compute[185846]: 2025-11-22 07:33:20.644 185850 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.215 185850 INFO nova.virt.driver [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 22 02:33:21 np0005531888 python3.9[186312]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.333 185850 INFO nova.compute.provider_config [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.359 185850 DEBUG oslo_concurrency.lockutils [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.359 185850 DEBUG oslo_concurrency.lockutils [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.360 185850 DEBUG oslo_concurrency.lockutils [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.360 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.360 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.361 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.361 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.361 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.361 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.361 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.361 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.362 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.363 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.363 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.363 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.363 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.363 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.363 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.364 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.364 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.364 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.364 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.364 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.365 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.365 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.365 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.365 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.365 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.366 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.366 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.366 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.366 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.366 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.366 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.367 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.367 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.367 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.367 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.367 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.367 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.368 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.368 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.368 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.368 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.368 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.369 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.369 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.369 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.369 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.369 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.369 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.370 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.370 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.370 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.370 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.370 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.371 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.372 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.372 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.372 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.372 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.372 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.372 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.373 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.374 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.374 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.374 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.374 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.374 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.375 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.375 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.375 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.375 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.375 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.375 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.376 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.376 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.376 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.376 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.376 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.376 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.377 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.377 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.377 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.377 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.377 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.378 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.378 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.378 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.378 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.378 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.378 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.379 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.379 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.379 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.379 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.379 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.380 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.381 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.382 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.383 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.384 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.384 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.384 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.384 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.384 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.384 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.385 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.385 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.385 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.385 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.385 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.385 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.386 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.386 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.386 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.386 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.386 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.387 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.388 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.388 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.388 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.389 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.389 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.389 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.389 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.389 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.390 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.390 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.390 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.390 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.390 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.391 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.391 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.391 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.391 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.391 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.391 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.392 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.392 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.392 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.392 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.393 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.393 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.393 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.393 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.393 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.393 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.394 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.394 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.394 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.394 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.394 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.395 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.395 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.395 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.395 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.395 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.396 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.396 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.396 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.396 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.396 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.397 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.397 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.397 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.397 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.397 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.398 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.398 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.398 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.398 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.398 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.398 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.399 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.399 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.399 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.399 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.399 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.399 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.400 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.400 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.400 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.400 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.400 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.401 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.401 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.401 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.401 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.401 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.401 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.402 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.402 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.402 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.402 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.402 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.403 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.403 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.403 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.403 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.403 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.404 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.404 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.404 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.404 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.405 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.405 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.405 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.405 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.405 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.406 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.406 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.406 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.406 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.406 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.407 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.407 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.407 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.407 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.408 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.408 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.408 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.408 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.408 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.408 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.409 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.409 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.409 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.409 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.409 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.409 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.410 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.410 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.410 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.410 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.410 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.410 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.411 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.411 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.411 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.411 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.411 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.411 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.412 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.413 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.414 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.415 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.416 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.416 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.416 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.416 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.416 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.416 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.417 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.418 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.418 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.418 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.418 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.418 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.418 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.419 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.419 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.419 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.419 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.419 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.420 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.420 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.420 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.420 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.420 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.421 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.421 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.421 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.421 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.421 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.422 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.422 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.422 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.422 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.422 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.422 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.423 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.423 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.423 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.423 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.423 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.424 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.424 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.424 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.424 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.424 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.424 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.425 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.426 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.426 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.426 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.426 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.426 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.426 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.427 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.427 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.427 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.427 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.427 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.428 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.428 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.428 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.428 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.428 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.428 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.429 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.429 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.429 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.429 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.429 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.430 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.430 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.430 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.430 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.430 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.430 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.431 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.431 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.431 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.431 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.431 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.431 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.432 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.432 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.432 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.432 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.432 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.432 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.433 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.434 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.434 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.434 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.434 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.434 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.435 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.435 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.435 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.435 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.435 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.436 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.436 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.436 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.436 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.436 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.436 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.437 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.437 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.437 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.437 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.437 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.438 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.438 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.438 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.438 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.438 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.439 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.439 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.439 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.439 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.439 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.439 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.440 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.440 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.440 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.440 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.440 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.441 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.441 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.441 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.441 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.441 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.442 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.442 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.442 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.442 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.442 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.442 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.443 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.443 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.443 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.443 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.443 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.444 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.444 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.444 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.444 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.444 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.445 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.445 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.445 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.445 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.445 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.446 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.446 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.446 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.446 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.446 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.446 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.447 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.447 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.447 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.447 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.447 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.448 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.448 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.448 185850 WARNING oslo_config.cfg [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 02:33:21 np0005531888 nova_compute[185846]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 02:33:21 np0005531888 nova_compute[185846]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 02:33:21 np0005531888 nova_compute[185846]: and ``live_migration_inbound_addr`` respectively.
Nov 22 02:33:21 np0005531888 nova_compute[185846]: ).  Its value may be silently ignored in the future.#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.448 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.449 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.449 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.449 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.449 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.449 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.450 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.450 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.450 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.450 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.450 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.451 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.451 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.451 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.451 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.451 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.452 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.452 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.452 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.452 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.452 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.453 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.453 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.453 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.453 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.453 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.454 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.454 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.454 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.454 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.454 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.455 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.455 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.455 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.455 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.455 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.456 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.456 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.456 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.456 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.456 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.457 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.457 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.457 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.457 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.457 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.457 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.458 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.458 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.458 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.458 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.458 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.459 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.459 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.459 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.459 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.459 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.460 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.460 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.460 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.460 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.460 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.461 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.461 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.461 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.461 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.461 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.461 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.462 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.462 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.462 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.462 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.462 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.463 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.463 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.463 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.463 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.463 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.463 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.464 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.464 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.464 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.464 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.464 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.465 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.465 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.465 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.465 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.465 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.466 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.466 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.466 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.466 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.466 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.466 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.467 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.467 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.467 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.467 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.467 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.468 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.468 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.468 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.468 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.468 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.468 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.469 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.469 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.469 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.469 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.469 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.470 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.470 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.470 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.470 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.470 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.470 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.471 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.471 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.471 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.471 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.471 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.472 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.472 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.472 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.472 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.472 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.473 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.473 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.473 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.473 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.473 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.473 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.474 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.474 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.474 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.474 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.474 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.475 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.475 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.475 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.475 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.476 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.476 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.476 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.476 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.476 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.477 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.477 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.477 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.477 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.477 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.477 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.478 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.478 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.478 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.478 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.478 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.479 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.479 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.479 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.479 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.479 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.480 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.480 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.480 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.480 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.480 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.480 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.481 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.481 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.481 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.481 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.481 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.481 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.482 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.483 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.483 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.483 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.483 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.483 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.484 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.484 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.484 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.484 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.484 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.484 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.485 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.485 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.485 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.485 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.485 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.485 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.486 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.487 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.487 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.487 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.487 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.487 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.488 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.488 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.488 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.489 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.489 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.489 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.489 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.489 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.489 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.490 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.490 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.490 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.490 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.490 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.490 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.491 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.492 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.492 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.492 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.492 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.492 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.492 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.493 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.493 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.493 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.493 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.493 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.493 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.494 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.494 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.494 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.494 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.494 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.495 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.495 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.495 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.495 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.495 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.495 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.496 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.496 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.496 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.496 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.496 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.497 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.497 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.497 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.497 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.497 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.497 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.498 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.498 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.498 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.498 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.498 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.498 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.499 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.499 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.499 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.499 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.499 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.499 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.500 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.501 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.501 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.501 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.501 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.501 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.501 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.502 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.502 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.502 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.502 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.502 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.502 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.503 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.503 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.503 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.503 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.503 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.503 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.504 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.504 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.504 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.504 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.505 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.505 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.505 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.505 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.506 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.506 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.506 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.506 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.506 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.506 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.507 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.507 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.507 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.507 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.507 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.507 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.508 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.509 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.509 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.509 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.509 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.509 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.509 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.510 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.510 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.510 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.510 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.510 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.511 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.511 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.511 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.511 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.511 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.511 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.512 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.512 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.512 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.512 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.512 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.512 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.513 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.514 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.514 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.514 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.514 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.514 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.514 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.515 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.516 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.517 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.518 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.519 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.520 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.521 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.521 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.521 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.521 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.521 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.522 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.522 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.522 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.522 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.522 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.522 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.523 185850 DEBUG oslo_service.service [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.524 185850 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.540 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.541 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.541 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.541 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 22 02:33:21 np0005531888 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 02:33:21 np0005531888 systemd[1]: Started libvirt QEMU daemon.
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.601 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f6dcfb4e190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.603 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f6dcfb4e190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.604 185850 INFO nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.666 185850 WARNING nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 22 02:33:21 np0005531888 nova_compute[185846]: 2025-11-22 07:33:21.667 185850 DEBUG nova.virt.libvirt.volume.mount [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.415 185850 INFO nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <host>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <uuid>0008dc3a-3a62-409d-9804-94baff3c1d3a</uuid>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <arch>x86_64</arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <microcode version='16777317'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <signature family='23' model='49' stepping='0'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='x2apic'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='tsc-deadline'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='osxsave'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='hypervisor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='tsc_adjust'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='spec-ctrl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='stibp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='arch-capabilities'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='cmp_legacy'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='topoext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='virt-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='lbrv'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='tsc-scale'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='vmcb-clean'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='pause-filter'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='pfthreshold'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='rdctl-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='skip-l1dfl-vmentry'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='mds-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature name='pschange-mc-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <pages unit='KiB' size='4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <pages unit='KiB' size='2048'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <pages unit='KiB' size='1048576'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <power_management>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <suspend_mem/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <suspend_disk/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <suspend_hybrid/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </power_management>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <iommu support='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <migration_features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <live/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <uri_transports>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <uri_transport>tcp</uri_transport>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <uri_transport>rdma</uri_transport>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </uri_transports>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </migration_features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <topology>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <cells num='1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <cell id='0'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          <memory unit='KiB'>7864320</memory>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          <pages unit='KiB' size='2048'>0</pages>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          <distances>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <sibling id='0' value='10'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          </distances>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          <cpus num='8'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:          </cpus>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        </cell>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </cells>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </topology>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <cache>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </cache>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <secmodel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model>selinux</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <doi>0</doi>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </secmodel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <secmodel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model>dac</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <doi>0</doi>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </secmodel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </host>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <guest>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <os_type>hvm</os_type>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <arch name='i686'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <wordsize>32</wordsize>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <domain type='qemu'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <domain type='kvm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <pae/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <nonpae/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <apic default='on' toggle='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <cpuselection/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <deviceboot/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <externalSnapshot/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </guest>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <guest>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <os_type>hvm</os_type>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <arch name='x86_64'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <wordsize>64</wordsize>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <domain type='qemu'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <domain type='kvm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <apic default='on' toggle='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <cpuselection/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <deviceboot/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <externalSnapshot/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </guest>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 
Nov 22 02:33:22 np0005531888 nova_compute[185846]: </capabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.422 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.441 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 02:33:22 np0005531888 nova_compute[185846]: <domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <arch>i686</arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <vcpu max='4096'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <os supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='firmware'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>rom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pflash</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>yes</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='secure'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </loader>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </os>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 python3.9[186524]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>memfd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </memoryBacking>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>disk</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>floppy</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>lun</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>fdc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>sata</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </disk>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vnc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </graphics>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <video supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vga</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>none</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>bochs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </video>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='mode'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>requisite</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>optional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pci</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hostdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>random</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </rng>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>path</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>handle</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </filesystem>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emulator</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>external</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>2.0</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </tpm>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </redirdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </channel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </crypto>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>passt</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </interface>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>isa</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </panic>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <console supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>null</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dev</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pipe</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stdio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>udp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tcp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </console>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='features'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vapic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>runtime</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>synic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stimer</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reset</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ipi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>avic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hyperv>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tdx</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </launchSecurity>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: </domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.448 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 02:33:22 np0005531888 nova_compute[185846]: <domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <arch>i686</arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <vcpu max='240'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <os supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='firmware'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>rom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pflash</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>yes</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='secure'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </loader>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </os>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>memfd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </memoryBacking>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>disk</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>floppy</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>lun</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ide</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>fdc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>sata</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </disk>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vnc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </graphics>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <video supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vga</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>none</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>bochs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </video>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='mode'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>requisite</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>optional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pci</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hostdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>random</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </rng>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>path</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>handle</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </filesystem>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emulator</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>external</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>2.0</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </tpm>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </redirdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </channel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </crypto>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>passt</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </interface>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>isa</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </panic>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <console supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>null</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dev</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pipe</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stdio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>udp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tcp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </console>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='features'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vapic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>runtime</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>synic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stimer</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reset</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ipi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>avic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hyperv>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tdx</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </launchSecurity>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: </domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.497 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.502 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 02:33:22 np0005531888 nova_compute[185846]: <domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <arch>x86_64</arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <vcpu max='4096'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <os supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='firmware'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>efi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>rom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pflash</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>yes</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='secure'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>yes</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </loader>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </os>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>memfd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </memoryBacking>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>disk</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>floppy</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>lun</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>fdc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>sata</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </disk>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vnc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </graphics>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <video supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vga</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>none</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>bochs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </video>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='mode'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>requisite</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>optional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pci</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hostdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>random</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </rng>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>path</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>handle</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </filesystem>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emulator</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>external</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>2.0</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </tpm>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </redirdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </channel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </crypto>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>passt</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </interface>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>isa</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </panic>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <console supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>null</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dev</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pipe</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stdio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>udp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tcp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </console>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='features'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vapic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>runtime</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>synic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stimer</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reset</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ipi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>avic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hyperv>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tdx</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </launchSecurity>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: </domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.615 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 02:33:22 np0005531888 nova_compute[185846]: <domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <arch>x86_64</arch>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <vcpu max='240'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <os supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='firmware'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>rom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pflash</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>yes</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='secure'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>no</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </loader>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </os>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>on</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>off</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </blockers>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </mode>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <value>memfd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </memoryBacking>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>disk</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>floppy</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>lun</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ide</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>fdc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>sata</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </disk>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vnc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </graphics>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <video supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vga</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>none</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>bochs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </video>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='mode'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>requisite</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>optional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pci</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>scsi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hostdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>random</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>egd</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </rng>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>path</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>handle</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </filesystem>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emulator</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>external</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>2.0</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </tpm>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='bus'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>usb</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </redirdev>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </channel>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>builtin</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </crypto>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>default</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>passt</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </interface>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='model'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>isa</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </panic>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <console supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='type'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>null</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vc</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pty</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dev</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>file</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>pipe</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stdio</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>udp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tcp</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>unix</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>dbus</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </console>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </devices>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='features'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vapic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>runtime</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>synic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>stimer</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reset</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>ipi</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>avic</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </defaults>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </hyperv>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:        <value>tdx</value>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:      </enum>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:    </launchSecurity>
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  </features>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: </domainCapabilities>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.676 185850 DEBUG nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.677 185850 INFO nova.virt.libvirt.host [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Secure Boot support detected#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.680 185850 INFO nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.680 185850 INFO nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.692 185850 DEBUG nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] cpu compare xml: <cpu match="exact">
Nov 22 02:33:22 np0005531888 nova_compute[185846]:  <model>Nehalem</model>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: </cpu>
Nov 22 02:33:22 np0005531888 nova_compute[185846]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.695 185850 DEBUG nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.729 185850 INFO nova.virt.node [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Determined node identity 1afd6948-7df7-46e7-8718-35e2b3007a5d from /var/lib/nova/compute_id#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.745 185850 WARNING nova.compute.manager [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Compute nodes ['1afd6948-7df7-46e7-8718-35e2b3007a5d'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.774 185850 INFO nova.compute.manager [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.876 185850 WARNING nova.compute.manager [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.876 185850 DEBUG oslo_concurrency.lockutils [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.876 185850 DEBUG oslo_concurrency.lockutils [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.877 185850 DEBUG oslo_concurrency.lockutils [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:22 np0005531888 nova_compute[185846]: 2025-11-22 07:33:22.877 185850 DEBUG nova.compute.resource_tracker [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:33:22 np0005531888 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 02:33:22 np0005531888 systemd[1]: Started libvirt nodedev daemon.
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.257 185850 WARNING nova.virt.libvirt.driver [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.259 185850 DEBUG nova.compute.resource_tracker [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6205MB free_disk=73.66464614868164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.260 185850 DEBUG oslo_concurrency.lockutils [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.260 185850 DEBUG oslo_concurrency.lockutils [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.273 185850 WARNING nova.compute.resource_tracker [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] No compute node record for compute-2.ctlplane.example.com:1afd6948-7df7-46e7-8718-35e2b3007a5d: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 1afd6948-7df7-46e7-8718-35e2b3007a5d could not be found.#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.310 185850 INFO nova.compute.resource_tracker [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 1afd6948-7df7-46e7-8718-35e2b3007a5d#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.373 185850 DEBUG nova.compute.resource_tracker [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.374 185850 DEBUG nova.compute.resource_tracker [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:33:23 np0005531888 python3.9[186727]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:33:23 np0005531888 systemd[1]: Stopping nova_compute container...
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.679 185850 DEBUG oslo_concurrency.lockutils [None req-238464a6-41bb-42d7-b940-68429550b23f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.679 185850 DEBUG oslo_concurrency.lockutils [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.680 185850 DEBUG oslo_concurrency.lockutils [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:33:23 np0005531888 nova_compute[185846]: 2025-11-22 07:33:23.680 185850 DEBUG oslo_concurrency.lockutils [None req-47b3fd86-61ba-4d16-a957-4b663ea5c0d7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:33:24 np0005531888 virtqemud[186358]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 22 02:33:24 np0005531888 virtqemud[186358]: hostname: compute-2
Nov 22 02:33:24 np0005531888 virtqemud[186358]: End of file while reading data: Input/output error
Nov 22 02:33:24 np0005531888 systemd[1]: libpod-bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81.scope: Deactivated successfully.
Nov 22 02:33:24 np0005531888 systemd[1]: libpod-bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81.scope: Consumed 3.550s CPU time.
Nov 22 02:33:24 np0005531888 podman[186732]: 2025-11-22 07:33:24.201282986 +0000 UTC m=+0.608321350 container died bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:33:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81-userdata-shm.mount: Deactivated successfully.
Nov 22 02:33:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay-4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb-merged.mount: Deactivated successfully.
Nov 22 02:33:24 np0005531888 podman[186732]: 2025-11-22 07:33:24.279231403 +0000 UTC m=+0.686269757 container cleanup bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:33:24 np0005531888 podman[186732]: nova_compute
Nov 22 02:33:24 np0005531888 podman[186758]: nova_compute
Nov 22 02:33:24 np0005531888 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 22 02:33:24 np0005531888 systemd[1]: Stopped nova_compute container.
Nov 22 02:33:24 np0005531888 systemd[1]: Starting nova_compute container...
Nov 22 02:33:24 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:33:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e531c7a7a93389a40bd1eab0e0643ec13a68a4e141af727375e32e5dd247aeb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531888 podman[186772]: 2025-11-22 07:33:24.47459908 +0000 UTC m=+0.098622525 container init bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:33:24 np0005531888 podman[186772]: 2025-11-22 07:33:24.485345195 +0000 UTC m=+0.109368620 container start bb4dd6963de7fee3f1089e5e9949390eac22917790768bd99ec909e6c4b3fb81 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + sudo -E kolla_set_configs
Nov 22 02:33:24 np0005531888 podman[186772]: nova_compute
Nov 22 02:33:24 np0005531888 systemd[1]: Started nova_compute container.
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Validating config file
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying service configuration files
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /etc/ceph
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Creating directory /etc/ceph
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Writing out command to execute
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531888 nova_compute[186788]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531888 nova_compute[186788]: ++ cat /run_command
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + CMD=nova-compute
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + ARGS=
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + sudo kolla_copy_cacerts
Nov 22 02:33:24 np0005531888 nova_compute[186788]: Running command: 'nova-compute'
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + [[ ! -n '' ]]
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + . kolla_extend_start
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + umask 0022
Nov 22 02:33:24 np0005531888 nova_compute[186788]: + exec nova-compute
Nov 22 02:33:25 np0005531888 python3.9[186951]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 02:33:25 np0005531888 systemd[1]: Started libpod-conmon-7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115.scope.
Nov 22 02:33:25 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:33:25 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f183df94b8bb2f82771bd43a5a55cdd718bfd9e1d85550eed8ca58848e15f76/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:25 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f183df94b8bb2f82771bd43a5a55cdd718bfd9e1d85550eed8ca58848e15f76/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:25 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f183df94b8bb2f82771bd43a5a55cdd718bfd9e1d85550eed8ca58848e15f76/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:25 np0005531888 podman[186977]: 2025-11-22 07:33:25.73871727 +0000 UTC m=+0.244002681 container init 7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:33:25 np0005531888 podman[186977]: 2025-11-22 07:33:25.748338184 +0000 UTC m=+0.253623575 container start 7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:33:25 np0005531888 python3.9[186951]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Applying nova statedir ownership
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 22 02:33:25 np0005531888 nova_compute_init[186998]: INFO:nova_statedir:Nova statedir ownership complete
Nov 22 02:33:25 np0005531888 systemd[1]: libpod-7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115.scope: Deactivated successfully.
Nov 22 02:33:25 np0005531888 podman[187012]: 2025-11-22 07:33:25.850276088 +0000 UTC m=+0.030135847 container died 7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, config_id=edpm)
Nov 22 02:33:25 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115-userdata-shm.mount: Deactivated successfully.
Nov 22 02:33:25 np0005531888 systemd[1]: var-lib-containers-storage-overlay-5f183df94b8bb2f82771bd43a5a55cdd718bfd9e1d85550eed8ca58848e15f76-merged.mount: Deactivated successfully.
Nov 22 02:33:25 np0005531888 podman[187012]: 2025-11-22 07:33:25.895228522 +0000 UTC m=+0.075088251 container cleanup 7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:33:25 np0005531888 systemd[1]: libpod-conmon-7e792c1b00ee95b9c54a4b33869d8bb7120a3f3a52f4b83d2a5943c9dd74e115.scope: Deactivated successfully.
Nov 22 02:33:26 np0005531888 systemd[1]: session-24.scope: Deactivated successfully.
Nov 22 02:33:26 np0005531888 systemd[1]: session-24.scope: Consumed 1min 52.415s CPU time.
Nov 22 02:33:26 np0005531888 systemd-logind[825]: Session 24 logged out. Waiting for processes to exit.
Nov 22 02:33:26 np0005531888 systemd-logind[825]: Removed session 24.
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.763 186792 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.764 186792 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.764 186792 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.764 186792 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.919 186792 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.943 186792 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:33:26 np0005531888 nova_compute[186788]: 2025-11-22 07:33:26.943 186792 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.411 186792 INFO nova.virt.driver [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.512 186792 INFO nova.compute.provider_config [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.527 186792 DEBUG oslo_concurrency.lockutils [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.528 186792 DEBUG oslo_concurrency.lockutils [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.528 186792 DEBUG oslo_concurrency.lockutils [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.528 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.528 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.528 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.529 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.529 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.529 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.529 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.530 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.530 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.530 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.530 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.530 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.530 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.531 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.531 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.531 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.531 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.531 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.532 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.532 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.532 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.532 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.532 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.532 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.533 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.533 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.533 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.533 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.533 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.533 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.534 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.534 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.534 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.534 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.534 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.534 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.535 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.535 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.535 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.535 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.535 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.535 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.536 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.536 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.536 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.536 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.536 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.536 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.537 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.537 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.537 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.537 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.537 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.538 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.538 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.538 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.538 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.538 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.538 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.539 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.540 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.540 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.540 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.540 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.540 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.541 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.541 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.541 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.541 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.541 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.542 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.543 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.543 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.543 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.543 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.543 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.544 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.544 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.544 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.544 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.544 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.544 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.545 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.546 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.547 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.548 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.548 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.548 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.548 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.548 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.549 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.550 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.551 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.552 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.553 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.553 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.553 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.553 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.553 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.553 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.554 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.555 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.556 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.557 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.557 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.557 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.557 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.557 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.557 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.558 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.559 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.560 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.561 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.562 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.562 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.562 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.562 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.562 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.562 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.563 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.564 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.564 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.564 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.564 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.564 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.564 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.565 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.566 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.567 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.568 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.569 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.569 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.569 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.569 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.569 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.569 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.570 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.571 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.572 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.572 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.572 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.572 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.572 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.572 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.573 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.573 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.573 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.573 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.573 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.573 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.574 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.575 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.575 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.575 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.575 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.575 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.576 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.576 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.576 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.576 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.577 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.578 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.579 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.579 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.579 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.579 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.579 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.580 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.581 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.581 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.581 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.581 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.581 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.582 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.582 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.582 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.582 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.582 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.582 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.583 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.583 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.583 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.583 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.583 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.583 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.584 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.584 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.584 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.584 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.584 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.584 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.585 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.585 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.585 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.585 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.585 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.586 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.586 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.586 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.586 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.586 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.586 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.587 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.587 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.587 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.587 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.587 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.587 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.588 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.589 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.590 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.591 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.591 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.591 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.591 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.591 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.591 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.592 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.592 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.592 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.592 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.592 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.592 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.593 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.594 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.595 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.595 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.595 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.595 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.595 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.595 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.596 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.596 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.596 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.596 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.596 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.597 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.597 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.597 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.597 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.597 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.597 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.598 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.599 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.600 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.600 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.600 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.600 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.600 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.600 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.601 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.602 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.602 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.602 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.602 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.602 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.602 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.603 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.603 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.603 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.603 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.603 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.603 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.604 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.604 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.604 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.604 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.604 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.604 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.605 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.606 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.606 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.606 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.606 186792 WARNING oslo_config.cfg [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 02:33:27 np0005531888 nova_compute[186788]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 02:33:27 np0005531888 nova_compute[186788]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 02:33:27 np0005531888 nova_compute[186788]: and ``live_migration_inbound_addr`` respectively.
Nov 22 02:33:27 np0005531888 nova_compute[186788]: ).  Its value may be silently ignored in the future.#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.606 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.607 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.607 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.607 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.607 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.607 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.607 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.608 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.608 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.608 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.608 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.608 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.608 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.609 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.610 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.610 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.610 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.610 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.610 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.610 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.611 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.611 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.611 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.611 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.611 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.611 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.612 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.612 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.612 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.612 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.612 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.613 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.613 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.613 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.613 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.613 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.613 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.614 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.614 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.614 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.614 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.614 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.615 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.616 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.616 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.616 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.616 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.616 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.616 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.617 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.617 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.617 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.617 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.617 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.617 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.618 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.619 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.619 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.619 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.619 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.619 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.619 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.620 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.620 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.620 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.620 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.620 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.621 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.622 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.623 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.623 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.623 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.623 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.623 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.623 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.624 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.625 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.625 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.625 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.625 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.625 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.625 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.626 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.627 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.628 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.628 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.628 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.628 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.628 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.628 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.629 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.629 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.629 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.629 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.630 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.630 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.630 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.630 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.630 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.631 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.631 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.631 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.631 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.631 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.631 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.632 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.632 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.632 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.632 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.632 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.632 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.633 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.634 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.634 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.634 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.634 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.634 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.634 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.635 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.635 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.635 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.635 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.635 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.636 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.636 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.636 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.636 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.636 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.637 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.637 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.637 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.637 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.637 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.637 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.638 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.638 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.638 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.638 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.638 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.639 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.639 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.639 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.639 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.639 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.640 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.640 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.640 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.640 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.641 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.641 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.641 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.641 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.642 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.642 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.642 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.642 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.642 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.642 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.643 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.643 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.643 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.643 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.643 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.644 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.644 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.644 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.644 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.644 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.644 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.645 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.645 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.645 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.645 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.645 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.645 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.646 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.647 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.647 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.647 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.647 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.647 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.647 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.648 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.648 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.648 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.648 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.648 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.649 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.649 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.649 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.649 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.649 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.650 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.650 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.650 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.650 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.650 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.650 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.651 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.652 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.652 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.652 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.652 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.652 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.652 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.653 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.653 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.653 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.653 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.653 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.653 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.654 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.654 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.654 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.654 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.654 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.654 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.655 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.655 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.655 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.655 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.655 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.655 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.656 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.657 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.658 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.659 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.659 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.659 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.659 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.659 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.659 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.660 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.660 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.660 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.660 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.660 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.660 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.661 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.662 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.662 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.662 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.662 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.662 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.662 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.663 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.663 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.663 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.663 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.663 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.664 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.664 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.664 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.664 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.664 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.665 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.665 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.665 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.665 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.665 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.665 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.666 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.666 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.666 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.666 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.666 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.667 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.667 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.667 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.667 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.667 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.668 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.669 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.669 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.669 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.669 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.669 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.669 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.670 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.671 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.672 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.672 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.672 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.672 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.672 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.673 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.673 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.673 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.673 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.673 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.673 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.674 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.674 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.674 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.674 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.674 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.674 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.675 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.676 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.676 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.676 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.676 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.676 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.676 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.677 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.677 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.677 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.677 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.677 186792 DEBUG oslo_service.service [None req-a235465b-3c34-482d-9177-95ad2a85fe2b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.678 186792 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.695 186792 INFO nova.virt.node [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Determined node identity 1afd6948-7df7-46e7-8718-35e2b3007a5d from /var/lib/nova/compute_id#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.696 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.696 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.696 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.697 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.715 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb001f42a90> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.719 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb001f42a90> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.721 186792 INFO nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.726 186792 INFO nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <host>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <uuid>0008dc3a-3a62-409d-9804-94baff3c1d3a</uuid>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <arch>x86_64</arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <microcode version='16777317'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <signature family='23' model='49' stepping='0'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='x2apic'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='tsc-deadline'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='osxsave'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='hypervisor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='tsc_adjust'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='spec-ctrl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='stibp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='arch-capabilities'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='cmp_legacy'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='topoext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='virt-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='lbrv'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='tsc-scale'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='vmcb-clean'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='pause-filter'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='pfthreshold'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='rdctl-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='skip-l1dfl-vmentry'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='mds-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature name='pschange-mc-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <pages unit='KiB' size='4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <pages unit='KiB' size='2048'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <pages unit='KiB' size='1048576'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <power_management>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <suspend_mem/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <suspend_disk/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <suspend_hybrid/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </power_management>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <iommu support='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <migration_features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <live/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <uri_transports>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <uri_transport>tcp</uri_transport>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <uri_transport>rdma</uri_transport>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </uri_transports>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </migration_features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <topology>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <cells num='1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <cell id='0'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          <memory unit='KiB'>7864320</memory>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          <pages unit='KiB' size='2048'>0</pages>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          <distances>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <sibling id='0' value='10'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          </distances>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          <cpus num='8'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:          </cpus>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        </cell>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </cells>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </topology>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <cache>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </cache>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <secmodel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model>selinux</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <doi>0</doi>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </secmodel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <secmodel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model>dac</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <doi>0</doi>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </secmodel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </host>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <guest>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <os_type>hvm</os_type>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <arch name='i686'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <wordsize>32</wordsize>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <domain type='qemu'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <domain type='kvm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <pae/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <nonpae/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <apic default='on' toggle='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <cpuselection/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <deviceboot/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <externalSnapshot/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </guest>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <guest>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <os_type>hvm</os_type>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <arch name='x86_64'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <wordsize>64</wordsize>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <domain type='qemu'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <domain type='kvm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <apic default='on' toggle='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <cpuselection/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <deviceboot/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <externalSnapshot/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </guest>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 
Nov 22 02:33:27 np0005531888 nova_compute[186788]: </capabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: #033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.736 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.742 186792 DEBUG nova.virt.libvirt.volume.mount [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.743 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 02:33:27 np0005531888 nova_compute[186788]: <domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <arch>i686</arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <vcpu max='240'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <os supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='firmware'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>rom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pflash</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>yes</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='secure'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </loader>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>file</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>memfd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </memoryBacking>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>disk</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>floppy</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>lun</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ide</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>fdc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>sata</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vnc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <video supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vga</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>none</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>bochs</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='mode'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>requisite</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>optional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pci</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </hostdev>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>random</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>egd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>path</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>handle</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </filesystem>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>emulator</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>external</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>2.0</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </tpm>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </redirdev>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </channel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>qemu</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </crypto>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>passt</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>isa</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </panic>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <console supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>null</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dev</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>file</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pipe</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>stdio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>udp</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tcp</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </console>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='features'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vapic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>runtime</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>synic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>stimer</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>reset</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ipi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>avic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <defaults>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </defaults>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </hyperv>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tdx</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </launchSecurity>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: </domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.754 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 02:33:27 np0005531888 nova_compute[186788]: <domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <arch>i686</arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <vcpu max='4096'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <os supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='firmware'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>rom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pflash</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>yes</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='secure'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </loader>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>file</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>memfd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </memoryBacking>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>disk</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>floppy</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>lun</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>fdc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>sata</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vnc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <video supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vga</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>none</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>bochs</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='mode'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>requisite</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>optional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pci</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </hostdev>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>random</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>egd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>path</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>handle</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </filesystem>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>emulator</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>external</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>2.0</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </tpm>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </redirdev>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </channel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>qemu</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </crypto>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>passt</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>isa</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </panic>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <console supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>null</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dev</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>file</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pipe</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>stdio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>udp</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tcp</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </console>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='features'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vapic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>runtime</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>synic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>stimer</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>reset</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ipi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>avic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <defaults>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </defaults>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </hyperv>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tdx</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </launchSecurity>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: </domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.788 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.793 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 02:33:27 np0005531888 nova_compute[186788]: <domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <arch>x86_64</arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <vcpu max='240'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <os supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='firmware'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>rom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pflash</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>yes</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='secure'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </loader>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>file</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>memfd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </memoryBacking>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>disk</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>floppy</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>lun</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ide</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>fdc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>sata</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vnc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <video supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vga</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>none</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>bochs</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='mode'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>requisite</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>optional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pci</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </hostdev>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>random</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>egd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>path</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>handle</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </filesystem>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>emulator</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>external</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>2.0</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </tpm>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </redirdev>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </channel>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>qemu</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </crypto>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>passt</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>isa</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </panic>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <console supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>null</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vc</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dev</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>file</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pipe</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>stdio</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>udp</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tcp</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </console>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='features'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vapic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>runtime</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>synic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>stimer</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>reset</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>ipi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>avic</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <defaults>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </defaults>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </hyperv>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>tdx</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </launchSecurity>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: </domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:27 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.861 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 02:33:27 np0005531888 nova_compute[186788]: <domainCapabilities>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <arch>x86_64</arch>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <vcpu max='4096'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <os supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <enum name='firmware'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>efi</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>rom</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>pflash</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>yes</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='secure'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>yes</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>no</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </loader>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:  <cpu>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>on</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <value>off</value>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx10'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx10-128'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx10-256'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx10-512'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='prefetchiti'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v3'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Haswell-v4'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512er'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512pf'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fma4'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tbm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xop'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='amx-tile'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-bf16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-fp16'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bitalg'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrc'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fzrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='la57'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='taa-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xfd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='SierraForest'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-ifma'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cmpccxadd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fbsdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='fsrs'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ibrs-all'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mcdt-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pbrsb-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='psdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='serialize'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vaes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='hle'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='rtm'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512bw'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512cd'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512dq'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512f'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='avx512vl'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='invpcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pcid'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='pku'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Snowridge'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='mpx'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='core-capability'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='split-lock-detect'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='cldemote'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='erms'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='gfni'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdir64b'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='movdiri'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='xsaves'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='athlon'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='athlon-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='core2duo'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='core2duo-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='coreduo'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='coreduo-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='n270'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='n270-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='ss'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='phenom'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <blockers model='phenom-v1'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnow'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <feature name='3dnowext'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </blockers>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </mode>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <memoryBacking supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <enum name='sourceType'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <value>file</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <value>anonymous</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <value>memfd</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  </memoryBacking>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <disk supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='diskDevice'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>disk</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>cdrom</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>floppy</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>lun</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>fdc</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>sata</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <graphics supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>vnc</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>egl-headless</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <video supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='modelType'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>vga</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>cirrus</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>none</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>bochs</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>ramfb</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <hostdev supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='mode'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>subsystem</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='startupPolicy'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>mandatory</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>requisite</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>optional</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='subsysType'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>pci</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>scsi</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='capsType'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='pciBackend'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </hostdev>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <rng supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio-transitional</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtio-non-transitional</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>random</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>egd</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <filesystem supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='driverType'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>path</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>handle</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>virtiofs</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </filesystem>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <tpm supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>tpm-tis</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>tpm-crb</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>emulator</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>external</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='backendVersion'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>2.0</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </tpm>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <redirdev supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='bus'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>usb</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </redirdev>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <channel supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </channel>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <crypto supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='model'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>qemu</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='backendModel'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>builtin</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </crypto>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <interface supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='backendType'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>default</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>passt</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <panic supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='model'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>isa</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>hyperv</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </panic>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <console supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='type'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>null</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>vc</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>pty</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>dev</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>file</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>pipe</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>stdio</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>udp</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>tcp</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>unix</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>qemu-vdagent</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>dbus</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </console>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <gic supported='no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <genid supported='yes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <backup supported='yes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <async-teardown supported='yes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <ps2 supported='yes'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <sev supported='no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <sgx supported='no'/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <hyperv supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='features'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>relaxed</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>vapic</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>spinlocks</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>vpindex</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>runtime</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>synic</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>stimer</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>reset</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>vendor_id</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>frequencies</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>reenlightenment</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>tlbflush</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>ipi</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>avic</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>emsr_bitmap</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>xmm_input</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <defaults>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </defaults>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </hyperv>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    <launchSecurity supported='yes'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      <enum name='sectype'>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:        <value>tdx</value>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:      </enum>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:    </launchSecurity>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:33:28 np0005531888 nova_compute[186788]: </domainCapabilities>
Nov 22 02:33:28 np0005531888 nova_compute[186788]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.948 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.948 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.949 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.949 186792 INFO nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Secure Boot support detected#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.951 186792 INFO nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.951 186792 INFO nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.961 186792 DEBUG nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] cpu compare xml: <cpu match="exact">
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <model>Nehalem</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]: </cpu>
Nov 22 02:33:28 np0005531888 nova_compute[186788]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.964 186792 DEBUG nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.981 186792 INFO nova.virt.node [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Determined node identity 1afd6948-7df7-46e7-8718-35e2b3007a5d from /var/lib/nova/compute_id#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:27.997 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Verified node 1afd6948-7df7-46e7-8718-35e2b3007a5d matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.025 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.533 186792 ERROR nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Could not retrieve compute node resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '1afd6948-7df7-46e7-8718-35e2b3007a5d' not found: No resource provider with uuid 1afd6948-7df7-46e7-8718-35e2b3007a5d found  ", "request_id": "req-60425b9e-d8f6-4a2e-bd11-0e03c25ccd23"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '1afd6948-7df7-46e7-8718-35e2b3007a5d' not found: No resource provider with uuid 1afd6948-7df7-46e7-8718-35e2b3007a5d found  ", "request_id": "req-60425b9e-d8f6-4a2e-bd11-0e03c25ccd23"}]}#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.549 186792 DEBUG oslo_concurrency.lockutils [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.549 186792 DEBUG oslo_concurrency.lockutils [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.549 186792 DEBUG oslo_concurrency.lockutils [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.550 186792 DEBUG nova.compute.resource_tracker [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.697 186792 WARNING nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.698 186792 DEBUG nova.compute.resource_tracker [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6180MB free_disk=73.66292572021484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.698 186792 DEBUG oslo_concurrency.lockutils [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.698 186792 DEBUG oslo_concurrency.lockutils [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.795 186792 ERROR nova.compute.resource_tracker [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '1afd6948-7df7-46e7-8718-35e2b3007a5d' not found: No resource provider with uuid 1afd6948-7df7-46e7-8718-35e2b3007a5d found  ", "request_id": "req-6f623044-8e64-4c2f-ac44-dbdcbfbf4b06"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '1afd6948-7df7-46e7-8718-35e2b3007a5d' not found: No resource provider with uuid 1afd6948-7df7-46e7-8718-35e2b3007a5d found  ", "request_id": "req-6f623044-8e64-4c2f-ac44-dbdcbfbf4b06"}]}#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.796 186792 DEBUG nova.compute.resource_tracker [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.797 186792 DEBUG nova.compute.resource_tracker [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.960 186792 INFO nova.scheduler.client.report [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [req-c8eb690c-5748-48fa-bf5e-3b14b0a761c3] Created resource provider record via placement API for resource provider with UUID 1afd6948-7df7-46e7-8718-35e2b3007a5d and name compute-2.ctlplane.example.com.#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.985 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 22 02:33:28 np0005531888 nova_compute[186788]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.986 186792 INFO nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.987 186792 DEBUG nova.compute.provider_tree [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.987 186792 DEBUG nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:33:28 np0005531888 nova_compute[186788]: 2025-11-22 07:33:28.990 186792 DEBUG nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Libvirt baseline CPU <cpu>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <arch>x86_64</arch>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <model>Nehalem</model>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <vendor>AMD</vendor>
Nov 22 02:33:28 np0005531888 nova_compute[186788]:  <topology sockets="8" cores="1" threads="1"/>
Nov 22 02:33:28 np0005531888 nova_compute[186788]: </cpu>
Nov 22 02:33:28 np0005531888 nova_compute[186788]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.279 186792 DEBUG nova.scheduler.client.report [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Updated inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.279 186792 DEBUG nova.compute.provider_tree [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Updating resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.279 186792 DEBUG nova.compute.provider_tree [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.565 186792 DEBUG nova.compute.provider_tree [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Updating resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.832 186792 DEBUG nova.compute.resource_tracker [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.832 186792 DEBUG oslo_concurrency.lockutils [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.832 186792 DEBUG nova.service [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.874 186792 DEBUG nova.service [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 22 02:33:29 np0005531888 nova_compute[186788]: 2025-11-22 07:33:29.875 186792 DEBUG nova.servicegroup.drivers.db [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 22 02:33:31 np0005531888 systemd-logind[825]: New session 26 of user zuul.
Nov 22 02:33:31 np0005531888 systemd[1]: Started Session 26 of User zuul.
Nov 22 02:33:32 np0005531888 python3.9[187239]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:33:34 np0005531888 python3.9[187395]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:33:34 np0005531888 systemd[1]: Reloading.
Nov 22 02:33:34 np0005531888 podman[187396]: 2025-11-22 07:33:34.747977209 +0000 UTC m=+0.115867577 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:33:34 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:34 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:35 np0005531888 python3.9[187605]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:33:35 np0005531888 network[187622]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:33:35 np0005531888 network[187623]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:33:35 np0005531888 network[187624]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:33:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:33:36.786 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:33:36.787 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:33:36.787 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:40 np0005531888 python3.9[187898]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:33:41 np0005531888 python3.9[188051]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:42 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:42 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:42 np0005531888 python3.9[188204]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:43 np0005531888 podman[188328]: 2025-11-22 07:33:43.634480092 +0000 UTC m=+0.066267868 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:33:43 np0005531888 python3.9[188375]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:33:44 np0005531888 python3.9[188528]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:33:45 np0005531888 python3.9[188680]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:33:45 np0005531888 systemd[1]: Reloading.
Nov 22 02:33:45 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:45 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:46 np0005531888 python3.9[188867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:33:47 np0005531888 podman[188992]: 2025-11-22 07:33:47.309622548 +0000 UTC m=+0.079106223 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:33:47 np0005531888 python3.9[189037]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:48 np0005531888 python3.9[189189]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:49 np0005531888 python3.9[189341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:49 np0005531888 python3.9[189462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796828.8087187-367-69065518587956/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:50 np0005531888 python3.9[189614]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 22 02:33:51 np0005531888 python3.9[189766]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 22 02:33:52 np0005531888 python3.9[189919]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:33:53 np0005531888 python3.9[190077]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:33:55 np0005531888 python3.9[190235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:55 np0005531888 python3.9[190356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796834.9034755-571-277896377656847/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:56 np0005531888 python3.9[190506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:57 np0005531888 python3.9[190627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796836.114872-571-257147059089182/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:57 np0005531888 python3.9[190777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:58 np0005531888 python3.9[190898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796837.291835-571-130020032447793/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:59 np0005531888 python3.9[191048]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:59 np0005531888 python3.9[191200]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:34:00 np0005531888 python3.9[191352]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:01 np0005531888 python3.9[191473]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796839.9830832-748-7035118706003/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:01 np0005531888 python3.9[191623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:02 np0005531888 python3.9[191699]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:02 np0005531888 python3.9[191849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:03 np0005531888 python3.9[191970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796842.4147842-748-59261213488586/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:04 np0005531888 python3.9[192120]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:04 np0005531888 python3.9[192241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796843.5889647-748-233605366150481/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:05 np0005531888 python3.9[192391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:05 np0005531888 podman[192486]: 2025-11-22 07:34:05.608405819 +0000 UTC m=+0.091302448 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:34:05 np0005531888 python3.9[192526]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796844.7339308-748-192774829878133/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:06 np0005531888 python3.9[192688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:06 np0005531888 python3.9[192809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796845.9111373-748-74267896239097/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:07 np0005531888 python3.9[192959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:08 np0005531888 python3.9[193080]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796847.0624707-748-38653957736726/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:08 np0005531888 python3.9[193230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:09 np0005531888 python3.9[193351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796848.1634252-748-97043464224853/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:09 np0005531888 python3.9[193501]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:10 np0005531888 python3.9[193622]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796849.30329-748-67529044905296/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:10 np0005531888 python3.9[193772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:11 np0005531888 python3.9[193893]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796850.4748685-748-272978265929687/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:12 np0005531888 python3.9[194043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:12 np0005531888 python3.9[194164]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796851.5951269-748-26298640753268/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:12 np0005531888 nova_compute[186788]: 2025-11-22 07:34:12.877 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:12 np0005531888 nova_compute[186788]: 2025-11-22 07:34:12.890 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:13 np0005531888 podman[194288]: 2025-11-22 07:34:13.904165965 +0000 UTC m=+0.058509193 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:34:14 np0005531888 python3.9[194327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:14 np0005531888 python3.9[194409]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:15 np0005531888 python3.9[194559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:15 np0005531888 python3.9[194635]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:16 np0005531888 python3.9[194785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:16 np0005531888 python3.9[194861]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:17 np0005531888 podman[194985]: 2025-11-22 07:34:17.467353146 +0000 UTC m=+0.062884800 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:34:17 np0005531888 python3.9[195033]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:18 np0005531888 python3.9[195185]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:19 np0005531888 python3.9[195337]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:19 np0005531888 python3.9[195489]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:34:20 np0005531888 systemd[1]: Reloading.
Nov 22 02:34:20 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:20 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:20 np0005531888 systemd[1]: Listening on Podman API Socket.
Nov 22 02:34:21 np0005531888 python3.9[195680]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:21 np0005531888 python3.9[195803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796860.7676103-1414-155344193667717/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:22 np0005531888 python3.9[195879]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:22 np0005531888 python3.9[196002]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796860.7676103-1414-155344193667717/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:24 np0005531888 python3.9[196154]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 22 02:34:25 np0005531888 python3.9[196306]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:34:26 np0005531888 python3[196458]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:34:26 np0005531888 podman[196494]: 2025-11-22 07:34:26.749617974 +0000 UTC m=+0.021261799 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.957 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.958 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.958 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.971 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.972 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.999 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:26.999 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.000 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.000 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:34:27 np0005531888 podman[196494]: 2025-11-22 07:34:27.013153758 +0000 UTC m=+0.284797583 container create 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 22 02:34:27 np0005531888 python3[196458]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.212 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.213 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6178MB free_disk=73.66235733032227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.213 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.213 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.273 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.302 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.314 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.316 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:34:27 np0005531888 nova_compute[186788]: 2025-11-22 07:34:27.316 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:34:27 np0005531888 python3.9[196684]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:34:28 np0005531888 python3.9[196838]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:29 np0005531888 python3.9[196989]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796868.835117-1605-105593720045954/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:30 np0005531888 python3.9[197065]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:34:30 np0005531888 systemd[1]: Reloading.
Nov 22 02:34:30 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:30 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:31 np0005531888 python3.9[197176]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:34:31 np0005531888 systemd[1]: Reloading.
Nov 22 02:34:31 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:31 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:31 np0005531888 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 02:34:32 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:34:32 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:32 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:32 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:32 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:32 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.
Nov 22 02:34:32 np0005531888 podman[197216]: 2025-11-22 07:34:32.47695978 +0000 UTC m=+0.585201681 container init 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + sudo -E kolla_set_configs
Nov 22 02:34:32 np0005531888 podman[197216]: 2025-11-22 07:34:32.502350048 +0000 UTC m=+0.610591949 container start 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Validating config file
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Copying service configuration files
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: INFO:__main__:Writing out command to execute
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: ++ cat /run_command
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + ARGS=
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + sudo kolla_copy_cacerts
Nov 22 02:34:32 np0005531888 podman[197216]: ceilometer_agent_compute
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:32 np0005531888 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + [[ ! -n '' ]]
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + . kolla_extend_start
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + umask 0022
Nov 22 02:34:32 np0005531888 ceilometer_agent_compute[197232]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 02:34:32 np0005531888 podman[197239]: 2025-11-22 07:34:32.636681994 +0000 UTC m=+0.117913608 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 02:34:32 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-646eab722ed8c674.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:34:32 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-646eab722ed8c674.service: Failed with result 'exit-code'.
Nov 22 02:34:33 np0005531888 python3.9[197415]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:34:33 np0005531888 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.622 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.623 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.624 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.625 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.626 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.627 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.628 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.629 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.630 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.631 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.632 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.633 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.634 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.635 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.636 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.636 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.636 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.636 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.636 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.636 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.637 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.638 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.639 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.640 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.640 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.640 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.640 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.640 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.641 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.660 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.662 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.663 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.792 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.902 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.902 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.903 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.903 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.903 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.903 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.903 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.903 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.904 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.905 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.905 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.905 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.905 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.905 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.905 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.906 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.907 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.908 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.909 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.910 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.913 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.914 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.915 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.916 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.917 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.918 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.919 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.920 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.922 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.923 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.924 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.928 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.932 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.939 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.939 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.942 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.943 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:33.944 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:34 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:34.040 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 22 02:34:34 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:34.041 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 22 02:34:34 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:34.041 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 22 02:34:34 np0005531888 ceilometer_agent_compute[197232]: 2025-11-22 07:34:34.052 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 22 02:34:34 np0005531888 virtqemud[186358]: End of file while reading data: Input/output error
Nov 22 02:34:34 np0005531888 virtqemud[186358]: End of file while reading data: Input/output error
Nov 22 02:34:34 np0005531888 systemd[1]: libpod-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.scope: Deactivated successfully.
Nov 22 02:34:34 np0005531888 systemd[1]: libpod-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.scope: Consumed 1.701s CPU time.
Nov 22 02:34:34 np0005531888 podman[197419]: 2025-11-22 07:34:34.29280562 +0000 UTC m=+0.684301367 container died 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:34:34 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-646eab722ed8c674.timer: Deactivated successfully.
Nov 22 02:34:34 np0005531888 systemd[1]: Stopped /usr/bin/podman healthcheck run 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.
Nov 22 02:34:34 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-userdata-shm.mount: Deactivated successfully.
Nov 22 02:34:34 np0005531888 systemd[1]: var-lib-containers-storage-overlay-ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65-merged.mount: Deactivated successfully.
Nov 22 02:34:34 np0005531888 podman[197419]: 2025-11-22 07:34:34.799129265 +0000 UTC m=+1.190625012 container cleanup 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_managed=true)
Nov 22 02:34:34 np0005531888 podman[197419]: ceilometer_agent_compute
Nov 22 02:34:34 np0005531888 podman[197452]: ceilometer_agent_compute
Nov 22 02:34:34 np0005531888 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 22 02:34:34 np0005531888 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 22 02:34:34 np0005531888 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 02:34:35 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:34:35 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae231241ea79f9669cbc6124ecdaa732521425c377d38e87600deae9cc958a65/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.
Nov 22 02:34:35 np0005531888 podman[197465]: 2025-11-22 07:34:35.413806753 +0000 UTC m=+0.499627148 container init 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2)
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + sudo -E kolla_set_configs
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:35 np0005531888 podman[197465]: 2025-11-22 07:34:35.446027153 +0000 UTC m=+0.531847558 container start 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Validating config file
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Copying service configuration files
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: INFO:__main__:Writing out command to execute
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: ++ cat /run_command
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + ARGS=
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + sudo kolla_copy_cacerts
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + [[ ! -n '' ]]
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + . kolla_extend_start
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + umask 0022
Nov 22 02:34:35 np0005531888 ceilometer_agent_compute[197480]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 02:34:35 np0005531888 podman[197465]: ceilometer_agent_compute
Nov 22 02:34:35 np0005531888 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 02:34:36 np0005531888 podman[197487]: 2025-11-22 07:34:36.145201446 +0000 UTC m=+0.689996369 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 02:34:36 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:34:36 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Failed with result 'exit-code'.
Nov 22 02:34:36 np0005531888 podman[197537]: 2025-11-22 07:34:36.260726919 +0000 UTC m=+0.114367853 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.566 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.566 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.566 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.566 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.568 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.568 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.568 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.568 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.568 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.569 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.570 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.571 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.572 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.573 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.574 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.575 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.576 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.577 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.578 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.579 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.580 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.581 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.582 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.583 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.584 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.585 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.586 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.605 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.607 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.607 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.621 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:36 np0005531888 python3.9[197690]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:34:36.787 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:34:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:34:36.789 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:34:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:34:36.789 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.795 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.795 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.796 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.796 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.796 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.796 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.796 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.796 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.797 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.798 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.799 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.800 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.801 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.802 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.803 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.804 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.805 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.806 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.807 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.808 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.809 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.810 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.811 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.812 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.813 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.814 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.815 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.816 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.817 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.818 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.819 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.820 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.823 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.829 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:34:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:37 np0005531888 python3.9[197819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796876.162481-1702-16727174585133/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:38 np0005531888 python3.9[197971]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 22 02:34:39 np0005531888 python3.9[198123]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:34:40 np0005531888 python3[198275]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:34:40 np0005531888 podman[198312]: 2025-11-22 07:34:40.470275783 +0000 UTC m=+0.024013462 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 02:34:41 np0005531888 podman[198312]: 2025-11-22 07:34:41.16626133 +0000 UTC m=+0.719998979 container create 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Nov 22 02:34:41 np0005531888 python3[198275]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 22 02:34:42 np0005531888 python3.9[198503]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:34:42 np0005531888 python3.9[198657]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:43 np0005531888 python3.9[198808]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796883.0643718-1860-155702680947804/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:44 np0005531888 podman[198856]: 2025-11-22 07:34:44.045463864 +0000 UTC m=+0.072404473 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:34:44 np0005531888 python3.9[198901]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:34:44 np0005531888 systemd[1]: Reloading.
Nov 22 02:34:44 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:44 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:45 np0005531888 python3.9[199014]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:34:45 np0005531888 systemd[1]: Reloading.
Nov 22 02:34:45 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:45 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:46 np0005531888 systemd[1]: Starting node_exporter container...
Nov 22 02:34:46 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:34:46 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46cc6a1813533ad36dee5eeb915d753c5d161bb6c5031f3b45c801aadec5363a/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:46 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46cc6a1813533ad36dee5eeb915d753c5d161bb6c5031f3b45c801aadec5363a/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:46 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.
Nov 22 02:34:46 np0005531888 podman[199054]: 2025-11-22 07:34:46.948034351 +0000 UTC m=+0.865124073 container init 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.964Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.964Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.964Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.964Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.964Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.964Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.965Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.966Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 02:34:46 np0005531888 node_exporter[199069]: ts=2025-11-22T07:34:46.967Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 02:34:46 np0005531888 podman[199054]: 2025-11-22 07:34:46.978433542 +0000 UTC m=+0.895523234 container start 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:34:47 np0005531888 podman[199054]: node_exporter
Nov 22 02:34:47 np0005531888 systemd[1]: Started node_exporter container.
Nov 22 02:34:47 np0005531888 podman[199078]: 2025-11-22 07:34:47.189395124 +0000 UTC m=+0.195804588 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:34:47 np0005531888 podman[199209]: 2025-11-22 07:34:47.701614166 +0000 UTC m=+0.067198214 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:34:48 np0005531888 python3.9[199273]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:34:48 np0005531888 systemd[1]: Stopping node_exporter container...
Nov 22 02:34:48 np0005531888 systemd[1]: libpod-936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.scope: Deactivated successfully.
Nov 22 02:34:48 np0005531888 podman[199277]: 2025-11-22 07:34:48.849824305 +0000 UTC m=+0.699075542 container died 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:34:49 np0005531888 systemd[1]: 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256-1108c36e352c096c.timer: Deactivated successfully.
Nov 22 02:34:49 np0005531888 systemd[1]: Stopped /usr/bin/podman healthcheck run 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.
Nov 22 02:34:49 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256-userdata-shm.mount: Deactivated successfully.
Nov 22 02:34:49 np0005531888 systemd[1]: var-lib-containers-storage-overlay-46cc6a1813533ad36dee5eeb915d753c5d161bb6c5031f3b45c801aadec5363a-merged.mount: Deactivated successfully.
Nov 22 02:34:50 np0005531888 podman[199277]: 2025-11-22 07:34:50.454407146 +0000 UTC m=+2.303658343 container cleanup 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:34:50 np0005531888 podman[199277]: node_exporter
Nov 22 02:34:50 np0005531888 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 02:34:50 np0005531888 podman[199306]: node_exporter
Nov 22 02:34:50 np0005531888 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 22 02:34:50 np0005531888 systemd[1]: Stopped node_exporter container.
Nov 22 02:34:50 np0005531888 systemd[1]: Starting node_exporter container...
Nov 22 02:34:50 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:34:50 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46cc6a1813533ad36dee5eeb915d753c5d161bb6c5031f3b45c801aadec5363a/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:50 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46cc6a1813533ad36dee5eeb915d753c5d161bb6c5031f3b45c801aadec5363a/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:51 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.
Nov 22 02:34:51 np0005531888 podman[199319]: 2025-11-22 07:34:51.31320125 +0000 UTC m=+0.734557219 container init 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.328Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.328Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.328Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.329Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.329Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.329Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.329Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.329Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.329Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.330Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 02:34:51 np0005531888 node_exporter[199335]: ts=2025-11-22T07:34:51.331Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 02:34:51 np0005531888 podman[199319]: 2025-11-22 07:34:51.337078307 +0000 UTC m=+0.758434256 container start 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:34:51 np0005531888 podman[199319]: node_exporter
Nov 22 02:34:51 np0005531888 systemd[1]: Started node_exporter container.
Nov 22 02:34:51 np0005531888 podman[199344]: 2025-11-22 07:34:51.546549894 +0000 UTC m=+0.194251459 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:34:52 np0005531888 python3.9[199519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:53 np0005531888 python3.9[199642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796891.8105884-1957-114663166346810/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:54 np0005531888 python3.9[199794]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 22 02:34:55 np0005531888 python3.9[199946]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:34:56 np0005531888 python3[200098]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:35:01 np0005531888 podman[200110]: 2025-11-22 07:35:01.40394078 +0000 UTC m=+4.803001551 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 02:35:01 np0005531888 podman[200207]: 2025-11-22 07:35:01.599064597 +0000 UTC m=+0.080988474 container create 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Nov 22 02:35:01 np0005531888 podman[200207]: 2025-11-22 07:35:01.544450427 +0000 UTC m=+0.026374304 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 02:35:01 np0005531888 python3[200098]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 22 02:35:02 np0005531888 python3.9[200397]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:35:03 np0005531888 python3.9[200551]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:04 np0005531888 python3.9[200702]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796903.911714-2115-195670278727402/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:05 np0005531888 python3.9[200778]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:35:05 np0005531888 systemd[1]: Reloading.
Nov 22 02:35:05 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:05 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:06 np0005531888 podman[200891]: 2025-11-22 07:35:06.73321643 +0000 UTC m=+0.098805152 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:35:06 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:35:06 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Failed with result 'exit-code'.
Nov 22 02:35:06 np0005531888 podman[200892]: 2025-11-22 07:35:06.775011817 +0000 UTC m=+0.138929341 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:35:06 np0005531888 python3.9[200890]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:35:06 np0005531888 systemd[1]: Reloading.
Nov 22 02:35:07 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:07 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:07 np0005531888 systemd[1]: Starting podman_exporter container...
Nov 22 02:35:07 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:35:07 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1339ecf3980b8704f0950bb6afdd20cbdbd51649269705c3e99996d632bfe82c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:07 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1339ecf3980b8704f0950bb6afdd20cbdbd51649269705c3e99996d632bfe82c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:07 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.
Nov 22 02:35:07 np0005531888 podman[200970]: 2025-11-22 07:35:07.701270426 +0000 UTC m=+0.378649752 container init 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.721Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.721Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.721Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.721Z caller=handler.go:105 level=info collector=container
Nov 22 02:35:07 np0005531888 podman[200970]: 2025-11-22 07:35:07.728619921 +0000 UTC m=+0.405999217 container start 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:35:07 np0005531888 systemd[1]: Starting Podman API Service...
Nov 22 02:35:07 np0005531888 systemd[1]: Started Podman API Service.
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="Setting parallel job count to 25"
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="Using sqlite as database backend"
Nov 22 02:35:07 np0005531888 podman[200970]: podman_exporter
Nov 22 02:35:07 np0005531888 systemd[1]: Started podman_exporter container.
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 22 02:35:07 np0005531888 podman[200996]: @ - - [22/Nov/2025:07:35:07 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 02:35:07 np0005531888 podman[200996]: time="2025-11-22T07:35:07Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 02:35:07 np0005531888 podman[200996]: @ - - [22/Nov/2025:07:35:07 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.878Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.879Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 02:35:07 np0005531888 podman_exporter[200985]: ts=2025-11-22T07:35:07.879Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 02:35:07 np0005531888 podman[200994]: 2025-11-22 07:35:07.885914963 +0000 UTC m=+0.144477150 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:35:07 np0005531888 systemd[1]: 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120-4dbec8b92c97d12b.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:35:07 np0005531888 systemd[1]: 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120-4dbec8b92c97d12b.service: Failed with result 'exit-code'.
Nov 22 02:35:08 np0005531888 python3.9[201182]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:35:08 np0005531888 systemd[1]: Stopping podman_exporter container...
Nov 22 02:35:09 np0005531888 podman[200996]: @ - - [22/Nov/2025:07:35:07 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 22 02:35:09 np0005531888 systemd[1]: libpod-1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.scope: Deactivated successfully.
Nov 22 02:35:09 np0005531888 podman[201186]: 2025-11-22 07:35:09.110632484 +0000 UTC m=+0.169184845 container died 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:09 np0005531888 systemd[1]: 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120-4dbec8b92c97d12b.timer: Deactivated successfully.
Nov 22 02:35:09 np0005531888 systemd[1]: Stopped /usr/bin/podman healthcheck run 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.
Nov 22 02:35:09 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120-userdata-shm.mount: Deactivated successfully.
Nov 22 02:35:09 np0005531888 systemd[1]: var-lib-containers-storage-overlay-1339ecf3980b8704f0950bb6afdd20cbdbd51649269705c3e99996d632bfe82c-merged.mount: Deactivated successfully.
Nov 22 02:35:09 np0005531888 podman[201186]: 2025-11-22 07:35:09.634278384 +0000 UTC m=+0.692830745 container cleanup 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:09 np0005531888 podman[201186]: podman_exporter
Nov 22 02:35:09 np0005531888 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 02:35:09 np0005531888 podman[201216]: podman_exporter
Nov 22 02:35:09 np0005531888 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 22 02:35:09 np0005531888 systemd[1]: Stopped podman_exporter container.
Nov 22 02:35:09 np0005531888 systemd[1]: Starting podman_exporter container...
Nov 22 02:35:09 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:35:09 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1339ecf3980b8704f0950bb6afdd20cbdbd51649269705c3e99996d632bfe82c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:09 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1339ecf3980b8704f0950bb6afdd20cbdbd51649269705c3e99996d632bfe82c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:09 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.
Nov 22 02:35:09 np0005531888 podman[201227]: 2025-11-22 07:35:09.963008531 +0000 UTC m=+0.232897674 container init 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:09 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:09.978Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 02:35:09 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:09.978Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 02:35:09 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:09.978Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 02:35:09 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:09.978Z caller=handler.go:105 level=info collector=container
Nov 22 02:35:09 np0005531888 podman[200996]: @ - - [22/Nov/2025:07:35:09 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 02:35:09 np0005531888 podman[200996]: time="2025-11-22T07:35:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 02:35:09 np0005531888 podman[201227]: 2025-11-22 07:35:09.991189956 +0000 UTC m=+0.261079089 container start 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:35:10 np0005531888 podman[201227]: podman_exporter
Nov 22 02:35:10 np0005531888 systemd[1]: Started podman_exporter container.
Nov 22 02:35:10 np0005531888 podman[200996]: @ - - [22/Nov/2025:07:35:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19570 "" "Go-http-client/1.1"
Nov 22 02:35:10 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:10.102Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 02:35:10 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:10.103Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 02:35:10 np0005531888 podman_exporter[201242]: ts=2025-11-22T07:35:10.103Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 02:35:10 np0005531888 podman[201252]: 2025-11-22 07:35:10.141106008 +0000 UTC m=+0.135479032 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:11 np0005531888 python3.9[201426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:35:12 np0005531888 python3.9[201549]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796910.9720201-2212-11692220606796/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:35:13 np0005531888 python3.9[201701]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 22 02:35:14 np0005531888 python3.9[201853]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:35:14 np0005531888 podman[201930]: 2025-11-22 07:35:14.699168521 +0000 UTC m=+0.071556880 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 02:35:14 np0005531888 auditd[705]: Audit daemon rotating log files
Nov 22 02:35:15 np0005531888 python3[202024]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:35:17 np0005531888 podman[202037]: 2025-11-22 07:35:17.736708378 +0000 UTC m=+2.514827570 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 02:35:17 np0005531888 podman[202132]: 2025-11-22 07:35:17.900430647 +0000 UTC m=+0.056604406 container create 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:35:17 np0005531888 podman[202132]: 2025-11-22 07:35:17.866337597 +0000 UTC m=+0.022511376 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 02:35:17 np0005531888 python3[202024]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 02:35:18 np0005531888 podman[202292]: 2025-11-22 07:35:18.608670653 +0000 UTC m=+0.075710474 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 02:35:18 np0005531888 python3.9[202340]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:35:19 np0005531888 python3.9[202494]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:20 np0005531888 python3.9[202645]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796919.7508547-2370-40125040620356/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:21 np0005531888 python3.9[202721]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:35:21 np0005531888 systemd[1]: Reloading.
Nov 22 02:35:21 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:21 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:21 np0005531888 podman[202804]: 2025-11-22 07:35:21.723110272 +0000 UTC m=+0.094059184 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:35:22 np0005531888 python3.9[202856]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:35:22 np0005531888 systemd[1]: Reloading.
Nov 22 02:35:22 np0005531888 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:22 np0005531888 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:22 np0005531888 systemd[1]: Starting openstack_network_exporter container...
Nov 22 02:35:22 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:35:22 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:22 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:22 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:22 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.
Nov 22 02:35:22 np0005531888 podman[202897]: 2025-11-22 07:35:22.590349919 +0000 UTC m=+0.138149664 container init 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, config_id=edpm)
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *bridge.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *coverage.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *datapath.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *iface.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *memory.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *ovnnorthd.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *ovn.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *ovsdbserver.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *pmd_perf.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *pmd_rxq.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: INFO    07:35:22 main.go:48: registering *vswitch.Collector
Nov 22 02:35:22 np0005531888 openstack_network_exporter[202913]: NOTICE  07:35:22 main.go:76: listening on https://:9105/metrics
Nov 22 02:35:22 np0005531888 podman[202897]: 2025-11-22 07:35:22.620392837 +0000 UTC m=+0.168192562 container start 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, 
io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:35:22 np0005531888 podman[202897]: openstack_network_exporter
Nov 22 02:35:22 np0005531888 systemd[1]: Started openstack_network_exporter container.
Nov 22 02:35:22 np0005531888 podman[202923]: 2025-11-22 07:35:22.727918409 +0000 UTC m=+0.090115064 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=edpm)
Nov 22 02:35:23 np0005531888 python3.9[203097]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:35:23 np0005531888 systemd[1]: Stopping openstack_network_exporter container...
Nov 22 02:35:23 np0005531888 systemd[1]: libpod-5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.scope: Deactivated successfully.
Nov 22 02:35:23 np0005531888 podman[203101]: 2025-11-22 07:35:23.741691621 +0000 UTC m=+0.093682037 container died 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 02:35:24 np0005531888 systemd[1]: 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e-5688af1541f2d311.timer: Deactivated successfully.
Nov 22 02:35:24 np0005531888 systemd[1]: Stopped /usr/bin/podman healthcheck run 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.
Nov 22 02:35:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e-userdata-shm.mount: Deactivated successfully.
Nov 22 02:35:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay-a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966-merged.mount: Deactivated successfully.
Nov 22 02:35:26 np0005531888 podman[203101]: 2025-11-22 07:35:26.755434024 +0000 UTC m=+3.107424440 container cleanup 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:35:26 np0005531888 podman[203101]: openstack_network_exporter
Nov 22 02:35:26 np0005531888 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 02:35:26 np0005531888 podman[203131]: openstack_network_exporter
Nov 22 02:35:26 np0005531888 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 22 02:35:26 np0005531888 systemd[1]: Stopped openstack_network_exporter container.
Nov 22 02:35:26 np0005531888 systemd[1]: Starting openstack_network_exporter container...
Nov 22 02:35:27 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:35:27 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:27 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:27 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2bc4b6f83e3efb2af58b21d3faad32f935f698058a4762a1f6f8a1ca0c09966/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:27 np0005531888 systemd[1]: Started /usr/bin/podman healthcheck run 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.
Nov 22 02:35:27 np0005531888 podman[203142]: 2025-11-22 07:35:27.134532893 +0000 UTC m=+0.274227480 container init 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *bridge.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *coverage.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *datapath.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *iface.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *memory.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *ovnnorthd.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *ovn.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *ovsdbserver.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *pmd_perf.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *pmd_rxq.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: INFO    07:35:27 main.go:48: registering *vswitch.Collector
Nov 22 02:35:27 np0005531888 openstack_network_exporter[203157]: NOTICE  07:35:27 main.go:76: listening on https://:9105/metrics
Nov 22 02:35:27 np0005531888 podman[203142]: 2025-11-22 07:35:27.164728165 +0000 UTC m=+0.304422722 container start 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Nov 22 02:35:27 np0005531888 podman[203142]: openstack_network_exporter
Nov 22 02:35:27 np0005531888 systemd[1]: Started openstack_network_exporter container.
Nov 22 02:35:27 np0005531888 podman[203167]: 2025-11-22 07:35:27.252527595 +0000 UTC m=+0.076732378 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.306 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.322 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.322 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.345 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.345 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.345 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.346 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.543 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.545 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5917MB free_disk=73.49401473999023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.545 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.546 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.625 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.625 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.658 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.688 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.690 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:35:27 np0005531888 nova_compute[186788]: 2025-11-22 07:35:27.690 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:35:28 np0005531888 python3.9[203339]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.321 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.323 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.323 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.323 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.339 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.339 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.340 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.340 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.340 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.340 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:35:28 np0005531888 nova_compute[186788]: 2025-11-22 07:35:28.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:35:36.788 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:35:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:35:36.789 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:35:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:35:36.789 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:35:37 np0005531888 podman[203364]: 2025-11-22 07:35:37.702352517 +0000 UTC m=+0.068197113 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:35:37 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:35:37 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Failed with result 'exit-code'.
Nov 22 02:35:37 np0005531888 podman[203365]: 2025-11-22 07:35:37.746229951 +0000 UTC m=+0.109282403 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 02:35:40 np0005531888 podman[203406]: 2025-11-22 07:35:40.679038711 +0000 UTC m=+0.055738827 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:45 np0005531888 podman[203430]: 2025-11-22 07:35:45.685590383 +0000 UTC m=+0.057913797 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:35:49 np0005531888 podman[203449]: 2025-11-22 07:35:49.683394328 +0000 UTC m=+0.051311846 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:35:52 np0005531888 podman[203469]: 2025-11-22 07:35:52.691224885 +0000 UTC m=+0.061450468 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:35:57 np0005531888 podman[203493]: 2025-11-22 07:35:57.694277922 +0000 UTC m=+0.064258378 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public)
Nov 22 02:36:08 np0005531888 podman[203514]: 2025-11-22 07:36:08.689877872 +0000 UTC m=+0.059800152 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:36:08 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:36:08 np0005531888 systemd[1]: 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0-5c15b4b899337be9.service: Failed with result 'exit-code'.
Nov 22 02:36:08 np0005531888 podman[203515]: 2025-11-22 07:36:08.727852569 +0000 UTC m=+0.092049565 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller)
Nov 22 02:36:11 np0005531888 podman[203555]: 2025-11-22 07:36:11.676306799 +0000 UTC m=+0.049670002 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:36:16 np0005531888 podman[203582]: 2025-11-22 07:36:16.677509817 +0000 UTC m=+0.049988259 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:36:20 np0005531888 podman[203601]: 2025-11-22 07:36:20.705831606 +0000 UTC m=+0.083416827 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 02:36:22 np0005531888 podman[203720]: 2025-11-22 07:36:22.889664206 +0000 UTC m=+0.056141402 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:36:23 np0005531888 python3.9[203772]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 22 02:36:23 np0005531888 python3.9[203937]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:24 np0005531888 systemd[1]: Started libpod-conmon-cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006.scope.
Nov 22 02:36:24 np0005531888 podman[203938]: 2025-11-22 07:36:24.03639445 +0000 UTC m=+0.132066235 container exec cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 02:36:24 np0005531888 podman[203938]: 2025-11-22 07:36:24.042380401 +0000 UTC m=+0.138052186 container exec_died cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 02:36:24 np0005531888 systemd[1]: libpod-conmon-cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006.scope: Deactivated successfully.
Nov 22 02:36:24 np0005531888 python3.9[204122]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:24 np0005531888 systemd[1]: Started libpod-conmon-cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006.scope.
Nov 22 02:36:24 np0005531888 podman[204123]: 2025-11-22 07:36:24.913000136 +0000 UTC m=+0.084733196 container exec cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 02:36:24 np0005531888 podman[204123]: 2025-11-22 07:36:24.950030361 +0000 UTC m=+0.121763411 container exec_died cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 02:36:24 np0005531888 systemd[1]: libpod-conmon-cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006.scope: Deactivated successfully.
Nov 22 02:36:25 np0005531888 python3.9[204306]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:26 np0005531888 python3.9[204458]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 22 02:36:26 np0005531888 nova_compute[186788]: 2025-11-22 07:36:26.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:27 np0005531888 python3.9[204623]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:27 np0005531888 systemd[1]: Started libpod-conmon-c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba.scope.
Nov 22 02:36:27 np0005531888 podman[204624]: 2025-11-22 07:36:27.23899318 +0000 UTC m=+0.071308443 container exec c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 02:36:27 np0005531888 podman[204624]: 2025-11-22 07:36:27.275192858 +0000 UTC m=+0.107508111 container exec_died c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 02:36:27 np0005531888 systemd[1]: libpod-conmon-c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba.scope: Deactivated successfully.
Nov 22 02:36:27 np0005531888 podman[204780]: 2025-11-22 07:36:27.855620128 +0000 UTC m=+0.071625419 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 22 02:36:27 np0005531888 nova_compute[186788]: 2025-11-22 07:36:27.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:27 np0005531888 nova_compute[186788]: 2025-11-22 07:36:27.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:27 np0005531888 nova_compute[186788]: 2025-11-22 07:36:27.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:36:27 np0005531888 nova_compute[186788]: 2025-11-22 07:36:27.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:36:27 np0005531888 nova_compute[186788]: 2025-11-22 07:36:27.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:36:27 np0005531888 nova_compute[186788]: 2025-11-22 07:36:27.967 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531888 python3.9[204827]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:28 np0005531888 systemd[1]: Started libpod-conmon-c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba.scope.
Nov 22 02:36:28 np0005531888 podman[204828]: 2025-11-22 07:36:28.153456849 +0000 UTC m=+0.088344883 container exec c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:36:28 np0005531888 podman[204828]: 2025-11-22 07:36:28.190924955 +0000 UTC m=+0.125813129 container exec_died c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 22 02:36:28 np0005531888 systemd[1]: libpod-conmon-c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba.scope: Deactivated successfully.
Nov 22 02:36:28 np0005531888 python3.9[205011]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:36:28 np0005531888 nova_compute[186788]: 2025-11-22 07:36:28.984 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.185 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.187 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5929MB free_disk=73.49368667602539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.187 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.187 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.260 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.260 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.281 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.294 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.297 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:36:29 np0005531888 nova_compute[186788]: 2025-11-22 07:36:29.298 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:36:29 np0005531888 python3.9[205163]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 22 02:36:30 np0005531888 nova_compute[186788]: 2025-11-22 07:36:30.296 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:30 np0005531888 nova_compute[186788]: 2025-11-22 07:36:30.296 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:30 np0005531888 nova_compute[186788]: 2025-11-22 07:36:30.296 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:36:30 np0005531888 python3.9[205328]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:30 np0005531888 systemd[1]: Started libpod-conmon-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.scope.
Nov 22 02:36:30 np0005531888 podman[205329]: 2025-11-22 07:36:30.605699851 +0000 UTC m=+0.090569201 container exec 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:36:30 np0005531888 podman[205329]: 2025-11-22 07:36:30.640972069 +0000 UTC m=+0.125841399 container exec_died 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:36:30 np0005531888 systemd[1]: libpod-conmon-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.scope: Deactivated successfully.
Nov 22 02:36:31 np0005531888 python3.9[205511]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:31 np0005531888 systemd[1]: Started libpod-conmon-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.scope.
Nov 22 02:36:31 np0005531888 podman[205512]: 2025-11-22 07:36:31.509821605 +0000 UTC m=+0.085188435 container exec 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 02:36:31 np0005531888 podman[205512]: 2025-11-22 07:36:31.545155804 +0000 UTC m=+0.120522634 container exec_died 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 22 02:36:31 np0005531888 systemd[1]: libpod-conmon-700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b.scope: Deactivated successfully.
Nov 22 02:36:32 np0005531888 python3.9[205694]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:32 np0005531888 python3.9[205846]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 22 02:36:33 np0005531888 python3.9[206011]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:34 np0005531888 systemd[1]: Started libpod-conmon-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.scope.
Nov 22 02:36:34 np0005531888 podman[206012]: 2025-11-22 07:36:34.088295935 +0000 UTC m=+0.089164032 container exec 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:36:34 np0005531888 podman[206032]: 2025-11-22 07:36:34.159828691 +0000 UTC m=+0.056279975 container exec_died 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 02:36:34 np0005531888 podman[206012]: 2025-11-22 07:36:34.165577686 +0000 UTC m=+0.166445753 container exec_died 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:36:34 np0005531888 systemd[1]: libpod-conmon-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.scope: Deactivated successfully.
Nov 22 02:36:34 np0005531888 python3.9[206197]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:35 np0005531888 systemd[1]: Started libpod-conmon-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.scope.
Nov 22 02:36:35 np0005531888 podman[206198]: 2025-11-22 07:36:35.032299247 +0000 UTC m=+0.118351367 container exec 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:36:35 np0005531888 podman[206218]: 2025-11-22 07:36:35.119764609 +0000 UTC m=+0.073611382 container exec_died 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:36:35 np0005531888 podman[206198]: 2025-11-22 07:36:35.218660441 +0000 UTC m=+0.304712541 container exec_died 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS)
Nov 22 02:36:35 np0005531888 systemd[1]: libpod-conmon-7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0.scope: Deactivated successfully.
Nov 22 02:36:36 np0005531888 python3.9[206381]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:36 np0005531888 python3.9[206533]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 22 02:36:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:36:36.789 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:36:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:36:36.790 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:36:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:36:36.790 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:36:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:37 np0005531888 python3.9[206696]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:37 np0005531888 systemd[1]: Started libpod-conmon-936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.scope.
Nov 22 02:36:37 np0005531888 podman[206697]: 2025-11-22 07:36:37.752075659 +0000 UTC m=+0.229103276 container exec 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:37 np0005531888 podman[206716]: 2025-11-22 07:36:37.879749008 +0000 UTC m=+0.113550743 container exec_died 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:36:37 np0005531888 podman[206697]: 2025-11-22 07:36:37.885909192 +0000 UTC m=+0.362936819 container exec_died 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:36:37 np0005531888 systemd[1]: libpod-conmon-936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.scope: Deactivated successfully.
Nov 22 02:36:38 np0005531888 python3.9[206879]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:38 np0005531888 systemd[1]: Started libpod-conmon-936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.scope.
Nov 22 02:36:38 np0005531888 podman[206880]: 2025-11-22 07:36:38.799344828 +0000 UTC m=+0.085267427 container exec 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:38 np0005531888 podman[206880]: 2025-11-22 07:36:38.830975877 +0000 UTC m=+0.116898436 container exec_died 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:38 np0005531888 systemd[1]: libpod-conmon-936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256.scope: Deactivated successfully.
Nov 22 02:36:38 np0005531888 podman[206897]: 2025-11-22 07:36:38.904773273 +0000 UTC m=+0.098423994 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Nov 22 02:36:38 np0005531888 podman[206899]: 2025-11-22 07:36:38.930610015 +0000 UTC m=+0.123476118 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 22 02:36:39 np0005531888 python3.9[207107]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:40 np0005531888 python3.9[207259]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 22 02:36:41 np0005531888 python3.9[207424]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:42 np0005531888 systemd[1]: Started libpod-conmon-1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.scope.
Nov 22 02:36:42 np0005531888 podman[207425]: 2025-11-22 07:36:42.047188533 +0000 UTC m=+0.831476844 container exec 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:36:42 np0005531888 podman[207425]: 2025-11-22 07:36:42.10080069 +0000 UTC m=+0.885088981 container exec_died 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:36:42 np0005531888 podman[207442]: 2025-11-22 07:36:42.207909861 +0000 UTC m=+0.158883159 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:36:42 np0005531888 systemd[1]: libpod-conmon-1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.scope: Deactivated successfully.
Nov 22 02:36:42 np0005531888 python3.9[207631]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:43 np0005531888 systemd[1]: Started libpod-conmon-1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.scope.
Nov 22 02:36:43 np0005531888 podman[207632]: 2025-11-22 07:36:43.07964805 +0000 UTC m=+0.091432041 container exec 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:36:43 np0005531888 podman[207652]: 2025-11-22 07:36:43.207801639 +0000 UTC m=+0.113260266 container exec_died 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:36:43 np0005531888 podman[207632]: 2025-11-22 07:36:43.273385876 +0000 UTC m=+0.285169887 container exec_died 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:36:43 np0005531888 systemd[1]: libpod-conmon-1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120.scope: Deactivated successfully.
Nov 22 02:36:44 np0005531888 python3.9[207816]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:44 np0005531888 python3.9[207968]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 22 02:36:45 np0005531888 python3.9[208134]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:46 np0005531888 systemd[1]: Started libpod-conmon-5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.scope.
Nov 22 02:36:46 np0005531888 podman[208135]: 2025-11-22 07:36:46.062731434 +0000 UTC m=+0.517783509 container exec 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 22 02:36:46 np0005531888 podman[208135]: 2025-11-22 07:36:46.316077827 +0000 UTC m=+0.771129942 container exec_died 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6)
Nov 22 02:36:47 np0005531888 systemd[1]: libpod-conmon-5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.scope: Deactivated successfully.
Nov 22 02:36:47 np0005531888 podman[208190]: 2025-11-22 07:36:47.352323787 +0000 UTC m=+0.069621156 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:36:47 np0005531888 python3.9[208337]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:47 np0005531888 systemd[1]: Started libpod-conmon-5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.scope.
Nov 22 02:36:47 np0005531888 podman[208338]: 2025-11-22 07:36:47.973089375 +0000 UTC m=+0.088232791 container exec 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 22 02:36:48 np0005531888 podman[208338]: 2025-11-22 07:36:48.009195941 +0000 UTC m=+0.124339377 container exec_died 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Nov 22 02:36:48 np0005531888 systemd[1]: libpod-conmon-5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e.scope: Deactivated successfully.
Nov 22 02:36:48 np0005531888 python3.9[208521]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:49 np0005531888 python3.9[208673]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:50 np0005531888 python3.9[208825]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:50 np0005531888 python3.9[208948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763797009.7608855-3214-248674013691797/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:51 np0005531888 podman[209072]: 2025-11-22 07:36:51.655828974 +0000 UTC m=+0.077048638 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 22 02:36:51 np0005531888 python3.9[209119]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:52 np0005531888 python3.9[209272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:53 np0005531888 python3.9[209350]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:53 np0005531888 podman[209474]: 2025-11-22 07:36:53.611310475 +0000 UTC m=+0.059018775 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:36:53 np0005531888 python3.9[209526]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:54 np0005531888 python3.9[209604]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.92m44nlj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:54 np0005531888 python3.9[209756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:55 np0005531888 python3.9[209834]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:56 np0005531888 python3.9[209986]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:36:57 np0005531888 python3[210139]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:36:57 np0005531888 python3.9[210291]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:58 np0005531888 podman[210341]: 2025-11-22 07:36:58.17665656 +0000 UTC m=+0.060456167 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:36:58 np0005531888 python3.9[210388]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:59 np0005531888 python3.9[210542]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:59 np0005531888 python3.9[210620]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:00 np0005531888 python3.9[210772]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:37:00 np0005531888 python3.9[210850]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:01 np0005531888 python3.9[211002]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:37:02 np0005531888 python3.9[211080]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:02 np0005531888 python3.9[211232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:37:03 np0005531888 python3.9[211357]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763797022.2954843-3589-250100854846293/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:04 np0005531888 python3.9[211509]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:04 np0005531888 python3.9[211661]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:37:05 np0005531888 python3.9[211816]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:06 np0005531888 python3.9[211968]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:37:07 np0005531888 python3.9[212121]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:37:08 np0005531888 python3.9[212275]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:37:09 np0005531888 python3.9[212430]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:09 np0005531888 systemd[1]: session-26.scope: Deactivated successfully.
Nov 22 02:37:09 np0005531888 systemd[1]: session-26.scope: Consumed 1min 44.756s CPU time.
Nov 22 02:37:09 np0005531888 systemd-logind[825]: Session 26 logged out. Waiting for processes to exit.
Nov 22 02:37:09 np0005531888 systemd-logind[825]: Removed session 26.
Nov 22 02:37:09 np0005531888 podman[212455]: 2025-11-22 07:37:09.687593962 +0000 UTC m=+0.064734486 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:37:09 np0005531888 podman[212456]: 2025-11-22 07:37:09.712939827 +0000 UTC m=+0.088826722 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:37:12 np0005531888 podman[212502]: 2025-11-22 07:37:12.681291566 +0000 UTC m=+0.053170529 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:37:17 np0005531888 podman[212527]: 2025-11-22 07:37:17.708659003 +0000 UTC m=+0.081389281 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 02:37:22 np0005531888 podman[212547]: 2025-11-22 07:37:22.688399388 +0000 UTC m=+0.056570089 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:37:24 np0005531888 podman[212567]: 2025-11-22 07:37:24.703670594 +0000 UTC m=+0.080082970 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:37:26 np0005531888 nova_compute[186788]: 2025-11-22 07:37:26.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:27 np0005531888 nova_compute[186788]: 2025-11-22 07:37:27.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531888 podman[212591]: 2025-11-22 07:37:28.706589569 +0000 UTC m=+0.078100315 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:37:28 np0005531888 nova_compute[186788]: 2025-11-22 07:37:28.951 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531888 nova_compute[186788]: 2025-11-22 07:37:28.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531888 nova_compute[186788]: 2025-11-22 07:37:28.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:37:28 np0005531888 nova_compute[186788]: 2025-11-22 07:37:28.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:37:28 np0005531888 nova_compute[186788]: 2025-11-22 07:37:28.974 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:37:29 np0005531888 nova_compute[186788]: 2025-11-22 07:37:29.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:29 np0005531888 nova_compute[186788]: 2025-11-22 07:37:29.970 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:29 np0005531888 nova_compute[186788]: 2025-11-22 07:37:29.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.957 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.958 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.958 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.958 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:37:30 np0005531888 nova_compute[186788]: 2025-11-22 07:37:30.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.173 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.174 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6002MB free_disk=73.49697494506836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.174 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.175 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.222 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.223 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.250 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.263 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.264 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:37:31 np0005531888 nova_compute[186788]: 2025-11-22 07:37:31.265 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:37:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:37:36.790 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:37:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:37:36.790 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:37:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:37:36.790 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:37:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:37:40.075 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:37:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:37:40.076 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:37:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:37:40.077 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:37:40 np0005531888 podman[212611]: 2025-11-22 07:37:40.6875775 +0000 UTC m=+0.061554363 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:37:40 np0005531888 podman[212612]: 2025-11-22 07:37:40.724255657 +0000 UTC m=+0.093217484 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:37:43 np0005531888 podman[212656]: 2025-11-22 07:37:43.675411818 +0000 UTC m=+0.052407232 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:37:48 np0005531888 podman[212679]: 2025-11-22 07:37:48.687839858 +0000 UTC m=+0.050123589 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:37:53 np0005531888 podman[212699]: 2025-11-22 07:37:53.688610059 +0000 UTC m=+0.065010132 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:37:55 np0005531888 podman[212719]: 2025-11-22 07:37:55.690714002 +0000 UTC m=+0.059523196 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:37:59 np0005531888 podman[212743]: 2025-11-22 07:37:59.682759945 +0000 UTC m=+0.058431150 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter)
Nov 22 02:38:11 np0005531888 podman[212764]: 2025-11-22 07:38:11.69669773 +0000 UTC m=+0.063106600 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251118)
Nov 22 02:38:11 np0005531888 podman[212765]: 2025-11-22 07:38:11.726704107 +0000 UTC m=+0.086761865 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:38:14 np0005531888 podman[212809]: 2025-11-22 07:38:14.677649344 +0000 UTC m=+0.055612941 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:38:19 np0005531888 podman[212833]: 2025-11-22 07:38:19.699832248 +0000 UTC m=+0.062794593 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:38:24 np0005531888 podman[212853]: 2025-11-22 07:38:24.693904188 +0000 UTC m=+0.068065109 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:38:26 np0005531888 podman[212874]: 2025-11-22 07:38:26.691700479 +0000 UTC m=+0.061385110 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:38:26 np0005531888 nova_compute[186788]: 2025-11-22 07:38:26.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:26 np0005531888 nova_compute[186788]: 2025-11-22 07:38:26.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:38:26 np0005531888 nova_compute[186788]: 2025-11-22 07:38:26.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:38:26 np0005531888 nova_compute[186788]: 2025-11-22 07:38:26.970 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:26 np0005531888 nova_compute[186788]: 2025-11-22 07:38:26.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:38:26 np0005531888 nova_compute[186788]: 2025-11-22 07:38:26.986 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:28 np0005531888 nova_compute[186788]: 2025-11-22 07:38:28.987 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:28 np0005531888 nova_compute[186788]: 2025-11-22 07:38:28.988 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:28 np0005531888 nova_compute[186788]: 2025-11-22 07:38:28.988 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:29 np0005531888 nova_compute[186788]: 2025-11-22 07:38:29.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:29 np0005531888 nova_compute[186788]: 2025-11-22 07:38:29.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:38:29 np0005531888 nova_compute[186788]: 2025-11-22 07:38:29.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:38:29 np0005531888 nova_compute[186788]: 2025-11-22 07:38:29.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:38:29 np0005531888 nova_compute[186788]: 2025-11-22 07:38:29.980 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:30 np0005531888 podman[212898]: 2025-11-22 07:38:30.683213711 +0000 UTC m=+0.054959795 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Nov 22 02:38:30 np0005531888 nova_compute[186788]: 2025-11-22 07:38:30.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:31 np0005531888 nova_compute[186788]: 2025-11-22 07:38:31.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:31 np0005531888 nova_compute[186788]: 2025-11-22 07:38:31.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:38:31 np0005531888 nova_compute[186788]: 2025-11-22 07:38:31.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:38:31 np0005531888 nova_compute[186788]: 2025-11-22 07:38:31.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:38:31 np0005531888 nova_compute[186788]: 2025-11-22 07:38:31.982 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.140 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.141 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6048MB free_disk=73.49697494506836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.141 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.275 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.376 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.474 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.475 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.498 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.525 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.554 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.574 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.575 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:38:32 np0005531888 nova_compute[186788]: 2025-11-22 07:38:32.576 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:38:33 np0005531888 nova_compute[186788]: 2025-11-22 07:38:33.575 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:33 np0005531888 nova_compute[186788]: 2025-11-22 07:38:33.576 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:33 np0005531888 nova_compute[186788]: 2025-11-22 07:38:33.576 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:38:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:38:36.791 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:38:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:38:36.792 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:38:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:38:36.793 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.831 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.832 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:38:36.833 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:42 np0005531888 podman[212920]: 2025-11-22 07:38:42.743676886 +0000 UTC m=+0.097346779 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:38:42 np0005531888 podman[212921]: 2025-11-22 07:38:42.77060829 +0000 UTC m=+0.123056193 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:38:45 np0005531888 podman[212964]: 2025-11-22 07:38:45.686311424 +0000 UTC m=+0.058969231 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:38:50 np0005531888 podman[212987]: 2025-11-22 07:38:50.708069755 +0000 UTC m=+0.076152892 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 02:38:55 np0005531888 podman[213007]: 2025-11-22 07:38:55.718717753 +0000 UTC m=+0.095537216 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 02:38:57 np0005531888 podman[213027]: 2025-11-22 07:38:57.685524172 +0000 UTC m=+0.054970995 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:39:01 np0005531888 podman[213051]: 2025-11-22 07:39:01.67427212 +0000 UTC m=+0.051833161 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Nov 22 02:39:13 np0005531888 podman[213072]: 2025-11-22 07:39:13.691674358 +0000 UTC m=+0.066517419 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:39:13 np0005531888 podman[213073]: 2025-11-22 07:39:13.727545628 +0000 UTC m=+0.096553449 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 22 02:39:16 np0005531888 podman[213114]: 2025-11-22 07:39:16.681485245 +0000 UTC m=+0.057407219 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:39:21 np0005531888 podman[213138]: 2025-11-22 07:39:21.679459305 +0000 UTC m=+0.050918294 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 02:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:39:22.513 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:39:22.514 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.265 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "a2baedff-f8ef-4615-b8fc-25275eb918a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.265 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "a2baedff-f8ef-4615-b8fc-25275eb918a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.281 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.480 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.481 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.488 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.489 186792 INFO nova.compute.claims [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.616 186792 DEBUG nova.compute.provider_tree [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.631 186792 DEBUG nova.scheduler.client.report [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.654 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.655 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.714 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.715 186792 DEBUG nova.network.neutron [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.742 186792 INFO nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.760 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.873 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.875 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.876 186792 INFO nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Creating image(s)#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.877 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "/var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.878 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "/var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.879 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "/var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.879 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:24 np0005531888 nova_compute[186788]: 2025-11-22 07:39:24.880 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:25 np0005531888 nova_compute[186788]: 2025-11-22 07:39:25.562 186792 DEBUG nova.network.neutron [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:39:25 np0005531888 nova_compute[186788]: 2025-11-22 07:39:25.563 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:39:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:39:26.516 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:39:26 np0005531888 podman[213157]: 2025-11-22 07:39:26.679634827 +0000 UTC m=+0.056971099 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 22 02:39:27 np0005531888 nova_compute[186788]: 2025-11-22 07:39:27.377 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:27 np0005531888 nova_compute[186788]: 2025-11-22 07:39:27.440 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:27 np0005531888 nova_compute[186788]: 2025-11-22 07:39:27.441 186792 DEBUG nova.virt.images [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] eb6eb4ac-7956-4021-b3a0-d612ae61d38c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:39:27 np0005531888 nova_compute[186788]: 2025-11-22 07:39:27.447 186792 DEBUG nova.privsep.utils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:39:27 np0005531888 nova_compute[186788]: 2025-11-22 07:39:27.448 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:28 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.334 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted" returned: 0 in 0.886s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:28 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.339 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:28 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.398 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:28 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.400 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:28 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.415 186792 INFO oslo.privsep.daemon [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmprhzgrc0y/privsep.sock']#033[00m
Nov 22 02:39:28 np0005531888 podman[213196]: 2025-11-22 07:39:28.673783061 +0000 UTC m=+0.049664503 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.151 186792 INFO oslo.privsep.daemon [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.988 213221 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.993 213221 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.995 213221 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:28.995 213221 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213221#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.269 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.325 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.327 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.328 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.340 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.398 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.400 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.434 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.436 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.436 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.509 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.510 186792 DEBUG nova.virt.disk.api [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Checking if we can resize image /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.511 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.578 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.580 186792 DEBUG nova.virt.disk.api [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Cannot resize image /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.580 186792 DEBUG nova.objects.instance [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lazy-loading 'migration_context' on Instance uuid a2baedff-f8ef-4615-b8fc-25275eb918a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.595 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.596 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Ensure instance console log exists: /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.596 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.596 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.597 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.599 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.605 186792 WARNING nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.610 186792 DEBUG nova.virt.libvirt.host [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.611 186792 DEBUG nova.virt.libvirt.host [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.614 186792 DEBUG nova.virt.libvirt.host [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.615 186792 DEBUG nova.virt.libvirt.host [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.617 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.617 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.617 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.618 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.618 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.618 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.618 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.619 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.619 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.619 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.619 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.619 186792 DEBUG nova.virt.hardware [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.624 186792 DEBUG nova.privsep.utils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.625 186792 DEBUG nova.objects.instance [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a2baedff-f8ef-4615-b8fc-25275eb918a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.638 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <uuid>a2baedff-f8ef-4615-b8fc-25275eb918a1</uuid>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <name>instance-00000001</name>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-179332878</nova:name>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:39:29</nova:creationTime>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:user uuid="ae0a9bb236424581bf35f94644a5484c">tempest-DeleteServersAdminTestJSON-2048590971-project-member</nova:user>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:        <nova:project uuid="dd45d638bd73499da80359efc81898a3">tempest-DeleteServersAdminTestJSON-2048590971</nova:project>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <entry name="serial">a2baedff-f8ef-4615-b8fc-25275eb918a1</entry>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <entry name="uuid">a2baedff-f8ef-4615-b8fc-25275eb918a1</entry>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.config"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/console.log" append="off"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:39:29 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:39:29 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:39:29 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:39:29 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.697 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.697 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.698 186792 INFO nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Using config drive#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.972 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.973 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:29 np0005531888 nova_compute[186788]: 2025-11-22 07:39:29.974 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:30 np0005531888 nova_compute[186788]: 2025-11-22 07:39:30.467 186792 INFO nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Creating config drive at /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.config#033[00m
Nov 22 02:39:30 np0005531888 nova_compute[186788]: 2025-11-22 07:39:30.471 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ch4s7lx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:30 np0005531888 nova_compute[186788]: 2025-11-22 07:39:30.596 186792 DEBUG oslo_concurrency.processutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ch4s7lx" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:30 np0005531888 systemd-machined[153106]: New machine qemu-1-instance-00000001.
Nov 22 02:39:30 np0005531888 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 22 02:39:30 np0005531888 nova_compute[186788]: 2025-11-22 07:39:30.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:30 np0005531888 nova_compute[186788]: 2025-11-22 07:39:30.958 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:30 np0005531888 nova_compute[186788]: 2025-11-22 07:39:30.978 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.308 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.309 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.317 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797171.3159435, a2baedff-f8ef-4615-b8fc-25275eb918a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.318 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.322 186792 INFO nova.virt.libvirt.driver [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Instance spawned successfully.#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.322 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.385 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.389 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.398 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.399 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.399 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.400 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.400 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.401 186792 DEBUG nova.virt.libvirt.driver [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.431 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.432 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797171.3210475, a2baedff-f8ef-4615-b8fc-25275eb918a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.432 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] VM Started (Lifecycle Event)#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.462 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.467 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.489 186792 INFO nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Took 6.61 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.490 186792 DEBUG nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.501 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.577 186792 INFO nova.compute.manager [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Took 7.18 seconds to build instance.#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.611 186792 DEBUG oslo_concurrency.lockutils [None req-cf83f059-5349-427a-937f-fcf43edadc1f ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "a2baedff-f8ef-4615-b8fc-25275eb918a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:31 np0005531888 nova_compute[186788]: 2025-11-22 07:39:31.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.079 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:32 np0005531888 podman[213265]: 2025-11-22 07:39:32.113221984 +0000 UTC m=+0.071745384 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=)
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.150 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.151 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.213 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.366 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.368 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5966MB free_disk=73.46148300170898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.368 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.368 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.887 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance a2baedff-f8ef-4615-b8fc-25275eb918a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.888 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.888 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:39:32 np0005531888 nova_compute[186788]: 2025-11-22 07:39:32.968 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.013 186792 ERROR nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [req-2432a65a-9c41-4c8c-b0eb-742632de7cf9] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 1afd6948-7df7-46e7-8718-35e2b3007a5d.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-2432a65a-9c41-4c8c-b0eb-742632de7cf9"}]}#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.034 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.068 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.069 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.088 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.117 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.174 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.227 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updated inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.227 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.228 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.260 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:39:33 np0005531888 nova_compute[186788]: 2025-11-22 07:39:33.261 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.893s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.730 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Acquiring lock "a2baedff-f8ef-4615-b8fc-25275eb918a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.732 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lock "a2baedff-f8ef-4615-b8fc-25275eb918a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.732 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Acquiring lock "a2baedff-f8ef-4615-b8fc-25275eb918a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.732 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lock "a2baedff-f8ef-4615-b8fc-25275eb918a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.733 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lock "a2baedff-f8ef-4615-b8fc-25275eb918a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.741 186792 INFO nova.compute.manager [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Terminating instance#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.748 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Acquiring lock "refresh_cache-a2baedff-f8ef-4615-b8fc-25275eb918a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.749 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Acquired lock "refresh_cache-a2baedff-f8ef-4615-b8fc-25275eb918a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:39:34 np0005531888 nova_compute[186788]: 2025-11-22 07:39:34.749 186792 DEBUG nova.network.neutron [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.056 186792 DEBUG nova.network.neutron [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.260 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.261 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.262 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.346 186792 DEBUG nova.network.neutron [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.369 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Releasing lock "refresh_cache-a2baedff-f8ef-4615-b8fc-25275eb918a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.370 186792 DEBUG nova.compute.manager [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:39:35 np0005531888 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 22 02:39:35 np0005531888 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 4.601s CPU time.
Nov 22 02:39:35 np0005531888 systemd-machined[153106]: Machine qemu-1-instance-00000001 terminated.
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.620 186792 INFO nova.virt.libvirt.driver [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Instance destroyed successfully.#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.621 186792 DEBUG nova.objects.instance [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lazy-loading 'resources' on Instance uuid a2baedff-f8ef-4615-b8fc-25275eb918a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.642 186792 INFO nova.virt.libvirt.driver [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Deleting instance files /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1_del#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.643 186792 INFO nova.virt.libvirt.driver [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Deletion of /var/lib/nova/instances/a2baedff-f8ef-4615-b8fc-25275eb918a1_del complete#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.835 186792 DEBUG nova.virt.libvirt.host [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.836 186792 INFO nova.virt.libvirt.host [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] UEFI support detected#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.838 186792 INFO nova.compute.manager [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.838 186792 DEBUG oslo.service.loopingcall [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.838 186792 DEBUG nova.compute.manager [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:39:35 np0005531888 nova_compute[186788]: 2025-11-22 07:39:35.839 186792 DEBUG nova.network.neutron [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.170 186792 DEBUG nova.network.neutron [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.181 186792 DEBUG nova.network.neutron [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.193 186792 INFO nova.compute.manager [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Took 0.35 seconds to deallocate network for instance.#033[00m
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.267 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.268 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.320 186792 DEBUG nova.compute.provider_tree [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.337 186792 DEBUG nova.scheduler.client.report [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.364 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.419 186792 INFO nova.scheduler.client.report [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Deleted allocations for instance a2baedff-f8ef-4615-b8fc-25275eb918a1
Nov 22 02:39:36 np0005531888 nova_compute[186788]: 2025-11-22 07:39:36.530 186792 DEBUG oslo_concurrency.lockutils [None req-2d79ef35-ff00-4bd7-9b83-725a3d9172ae 3c5e455093a94a34819e4d0cb8fc410a e1fe9aa2d16c42dd83caccd52842f791 - - default default] Lock "a2baedff-f8ef-4615-b8fc-25275eb918a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:39:36.793 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:39:36.793 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:39:36.793 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:44 np0005531888 podman[213302]: 2025-11-22 07:39:44.705080763 +0000 UTC m=+0.071579390 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:39:44 np0005531888 podman[213303]: 2025-11-22 07:39:44.73411927 +0000 UTC m=+0.097350299 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:39:47 np0005531888 podman[213347]: 2025-11-22 07:39:47.677928413 +0000 UTC m=+0.052689855 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:39:50 np0005531888 nova_compute[186788]: 2025-11-22 07:39:50.618 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797175.6177225, a2baedff-f8ef-4615-b8fc-25275eb918a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:39:50 np0005531888 nova_compute[186788]: 2025-11-22 07:39:50.619 186792 INFO nova.compute.manager [-] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] VM Stopped (Lifecycle Event)
Nov 22 02:39:50 np0005531888 nova_compute[186788]: 2025-11-22 07:39:50.653 186792 DEBUG nova.compute.manager [None req-5a84d581-31b5-4659-aab7-2aa13c9044f9 - - - - - -] [instance: a2baedff-f8ef-4615-b8fc-25275eb918a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:39:52 np0005531888 podman[213371]: 2025-11-22 07:39:52.694412166 +0000 UTC m=+0.055463743 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.589 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.590 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.612 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.780 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.781 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.787 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:39:53 np0005531888 nova_compute[186788]: 2025-11-22 07:39:53.787 186792 INFO nova.compute.claims [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Claim successful on node compute-2.ctlplane.example.com
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.016 186792 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.039 186792 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.073 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.074 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.152 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.153 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.193 186792 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.231 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.441 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.444 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.444 186792 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Creating image(s)
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.446 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "/var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.447 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.448 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.471 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.515 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Automatically allocating a network for project 98627e04b62e4ce4bf9650377c674f73. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.530 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.531 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.532 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.542 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.635 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.636 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.839 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk 1073741824" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.841 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.841 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.896 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.897 186792 DEBUG nova.virt.disk.api [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Checking if we can resize image /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.898 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.951 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.952 186792 DEBUG nova.virt.disk.api [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Cannot resize image /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.953 186792 DEBUG nova.objects.instance [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.966 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.966 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Ensure instance console log exists: /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.966 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.967 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:54 np0005531888 nova_compute[186788]: 2025-11-22 07:39:54.967 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:57 np0005531888 podman[213406]: 2025-11-22 07:39:57.704653088 +0000 UTC m=+0.070261908 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:39:59 np0005531888 podman[213427]: 2025-11-22 07:39:59.684650603 +0000 UTC m=+0.054948681 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:40:02 np0005531888 podman[213451]: 2025-11-22 07:40:02.691920509 +0000 UTC m=+0.066008325 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.692 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "8f7e8343-7bd9-402c-bea5-4a3202b54681" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.692 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.725 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.872 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.873 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.882 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:40:04 np0005531888 nova_compute[186788]: 2025-11-22 07:40:04.883 186792 INFO nova.compute.claims [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.141 186792 DEBUG nova.compute.provider_tree [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.164 186792 DEBUG nova.scheduler.client.report [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.200 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.201 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.336 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.337 186792 DEBUG nova.network.neutron [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.367 186792 INFO nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.401 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.636 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.637 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.638 186792 INFO nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Creating image(s)#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.638 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "/var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.638 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "/var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.639 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "/var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.650 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.708 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.709 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.710 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.721 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.780 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.781 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.823 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.824 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.825 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.890 186792 WARNING oslo_policy.policy [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.892 186792 WARNING oslo_policy.policy [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.894 186792 DEBUG nova.policy [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.898 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.898 186792 DEBUG nova.virt.disk.api [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Checking if we can resize image /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.898 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.957 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.958 186792 DEBUG nova.virt.disk.api [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Cannot resize image /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.959 186792 DEBUG nova.objects.instance [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lazy-loading 'migration_context' on Instance uuid 8f7e8343-7bd9-402c-bea5-4a3202b54681 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.994 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.995 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Ensure instance console log exists: /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.995 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.996 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:05 np0005531888 nova_compute[186788]: 2025-11-22 07:40:05.996 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:08 np0005531888 nova_compute[186788]: 2025-11-22 07:40:08.650 186792 DEBUG nova.network.neutron [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Successfully created port: b405020c-4786-4b1d-9c68-6ab4d6d7689a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:40:10 np0005531888 nova_compute[186788]: 2025-11-22 07:40:10.798 186792 DEBUG nova.network.neutron [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Successfully updated port: b405020c-4786-4b1d-9c68-6ab4d6d7689a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:40:10 np0005531888 nova_compute[186788]: 2025-11-22 07:40:10.820 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:10 np0005531888 nova_compute[186788]: 2025-11-22 07:40:10.820 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquired lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:10 np0005531888 nova_compute[186788]: 2025-11-22 07:40:10.821 186792 DEBUG nova.network.neutron [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:40:11 np0005531888 nova_compute[186788]: 2025-11-22 07:40:11.240 186792 DEBUG nova.network.neutron [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:40:11 np0005531888 nova_compute[186788]: 2025-11-22 07:40:11.596 186792 DEBUG nova.compute.manager [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Received event network-changed-b405020c-4786-4b1d-9c68-6ab4d6d7689a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:11 np0005531888 nova_compute[186788]: 2025-11-22 07:40:11.597 186792 DEBUG nova.compute.manager [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Refreshing instance network info cache due to event network-changed-b405020c-4786-4b1d-9c68-6ab4d6d7689a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:40:11 np0005531888 nova_compute[186788]: 2025-11-22 07:40:11.597 186792 DEBUG oslo_concurrency.lockutils [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.413 186792 DEBUG nova.network.neutron [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Updating instance_info_cache with network_info: [{"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.451 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Releasing lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.451 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Instance network_info: |[{"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.452 186792 DEBUG oslo_concurrency.lockutils [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.452 186792 DEBUG nova.network.neutron [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Refreshing network info cache for port b405020c-4786-4b1d-9c68-6ab4d6d7689a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.455 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Start _get_guest_xml network_info=[{"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.461 186792 WARNING nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.467 186792 DEBUG nova.virt.libvirt.host [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.468 186792 DEBUG nova.virt.libvirt.host [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.472 186792 DEBUG nova.virt.libvirt.host [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.473 186792 DEBUG nova.virt.libvirt.host [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.474 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.474 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1890587748',id=7,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-22804361',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.475 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.475 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.475 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.475 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.475 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.476 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.476 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.476 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.476 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.477 186792 DEBUG nova.virt.hardware [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.480 186792 DEBUG nova.virt.libvirt.vif [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:40:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-2058718492',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-2058718492',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(7),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-2058718492',id=8,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=7,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNvo81uLNODY5pXMCXv/rgxcCiuBWxjDFSMOswBarzwWE4bZrCdQaaMgGCGacDcycmYMfjuNyIpB44+zMTJDP3JvkVGjJV4StWUn/AhoiRpx02XDT0ns/iRT7Ya1fxBPw==',key_name='tempest-keypair-346383250',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af1e32bc189c402bad715e6c4cc8dcfa',ramdisk_id='',reservation_id='r-xkf8i573',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:40:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0eeafa43d6c84f6888a05c3f4ca3fb78',uuid=8f7e8343-7bd9-402c-bea5-4a3202b54681,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.481 186792 DEBUG nova.network.os_vif_util [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converting VIF {"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.482 186792 DEBUG nova.network.os_vif_util [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:68:db,bridge_name='br-int',has_traffic_filtering=True,id=b405020c-4786-4b1d-9c68-6ab4d6d7689a,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb405020c-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.483 186792 DEBUG nova.objects.instance [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lazy-loading 'pci_devices' on Instance uuid 8f7e8343-7bd9-402c-bea5-4a3202b54681 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.540 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <uuid>8f7e8343-7bd9-402c-bea5-4a3202b54681</uuid>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <name>instance-00000008</name>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-2058718492</nova:name>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:40:14</nova:creationTime>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:flavor name="tempest-flavor_with_ephemeral_0-22804361">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:user uuid="0eeafa43d6c84f6888a05c3f4ca3fb78">tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member</nova:user>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:project uuid="af1e32bc189c402bad715e6c4cc8dcfa">tempest-ServersWithSpecificFlavorTestJSON-1826293598</nova:project>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        <nova:port uuid="b405020c-4786-4b1d-9c68-6ab4d6d7689a">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <entry name="serial">8f7e8343-7bd9-402c-bea5-4a3202b54681</entry>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <entry name="uuid">8f7e8343-7bd9-402c-bea5-4a3202b54681</entry>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.config"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:72:68:db"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <target dev="tapb405020c-47"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/console.log" append="off"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:40:14 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:40:14 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:40:14 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:40:14 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.542 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Preparing to wait for external event network-vif-plugged-b405020c-4786-4b1d-9c68-6ab4d6d7689a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.544 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.544 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.544 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.545 186792 DEBUG nova.virt.libvirt.vif [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:40:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-2058718492',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-2058718492',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(7),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-2058718492',id=8,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=7,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNvo81uLNODY5pXMCXv/rgxcCiuBWxjDFSMOswBarzwWE4bZrCdQaaMgGCGacDcycmYMfjuNyIpB44+zMTJDP3JvkVGjJV4StWUn/AhoiRpx02XDT0ns/iRT7Ya1fxBPw==',key_name='tempest-keypair-346383250',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af1e32bc189c402bad715e6c4cc8dcfa',ramdisk_id='',reservation_id='r-xkf8i573',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:40:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0eeafa43d6c84f6888a05c3f4ca3fb78',uuid=8f7e8343-7bd9-402c-bea5-4a3202b54681,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.546 186792 DEBUG nova.network.os_vif_util [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converting VIF {"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.546 186792 DEBUG nova.network.os_vif_util [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:68:db,bridge_name='br-int',has_traffic_filtering=True,id=b405020c-4786-4b1d-9c68-6ab4d6d7689a,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb405020c-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.547 186792 DEBUG os_vif [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:68:db,bridge_name='br-int',has_traffic_filtering=True,id=b405020c-4786-4b1d-9c68-6ab4d6d7689a,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb405020c-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.593 186792 DEBUG ovsdbapp.backend.ovs_idl [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.594 186792 DEBUG ovsdbapp.backend.ovs_idl [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.594 186792 DEBUG ovsdbapp.backend.ovs_idl [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.596 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.597 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [POLLOUT] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.597 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.598 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.599 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.602 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.613 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.614 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.615 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:40:14 np0005531888 nova_compute[186788]: 2025-11-22 07:40:14.616 186792 INFO oslo.privsep.daemon [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpea_p5hl3/privsep.sock']#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.395 186792 INFO oslo.privsep.daemon [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.243 213491 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.249 213491 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.251 213491 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.251 213491 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213491#033[00m
Nov 22 02:40:15 np0005531888 podman[213495]: 2025-11-22 07:40:15.694742499 +0000 UTC m=+0.065982961 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:40:15 np0005531888 podman[213496]: 2025-11-22 07:40:15.725618297 +0000 UTC m=+0.095213649 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.752 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb405020c-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.752 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb405020c-47, col_values=(('external_ids', {'iface-id': 'b405020c-4786-4b1d-9c68-6ab4d6d7689a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:68:db', 'vm-uuid': '8f7e8343-7bd9-402c-bea5-4a3202b54681'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:15 np0005531888 NetworkManager[55166]: <info>  [1763797215.7560] manager: (tapb405020c-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.754 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.764 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.765 186792 INFO os_vif [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:68:db,bridge_name='br-int',has_traffic_filtering=True,id=b405020c-4786-4b1d-9c68-6ab4d6d7689a,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb405020c-47')#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.853 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.854 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.854 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No VIF found with MAC fa:16:3e:72:68:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.855 186792 INFO nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Using config drive#033[00m
Nov 22 02:40:15 np0005531888 nova_compute[186788]: 2025-11-22 07:40:15.862 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.203 186792 INFO nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Creating config drive at /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.config#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.210 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6xaf9_vb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.340 186792 DEBUG oslo_concurrency.processutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6xaf9_vb" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:17 np0005531888 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 22 02:40:17 np0005531888 kernel: tapb405020c-47: entered promiscuous mode
Nov 22 02:40:17 np0005531888 NetworkManager[55166]: <info>  [1763797217.4127] manager: (tapb405020c-47): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Nov 22 02:40:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:17Z|00027|binding|INFO|Claiming lport b405020c-4786-4b1d-9c68-6ab4d6d7689a for this chassis.
Nov 22 02:40:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:17Z|00028|binding|INFO|b405020c-4786-4b1d-9c68-6ab4d6d7689a: Claiming fa:16:3e:72:68:db 10.100.0.12
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.414 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.417 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:17.442 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:68:db 10.100.0.12'], port_security=['fa:16:3e:72:68:db 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b53dee3e-cc57-4959-b703-fd736782ce77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08d7499b-95e9-4cf7-b602-701ee3e333bf, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=b405020c-4786-4b1d-9c68-6ab4d6d7689a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:40:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:17.443 104023 INFO neutron.agent.ovn.metadata.agent [-] Port b405020c-4786-4b1d-9c68-6ab4d6d7689a in datapath 1ae6b2a9-f586-4520-bc3d-923fe57139cb bound to our chassis#033[00m
Nov 22 02:40:17 np0005531888 systemd-udevd[213561]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:40:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:17.446 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1ae6b2a9-f586-4520-bc3d-923fe57139cb#033[00m
Nov 22 02:40:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:17.448 104023 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp94hfjbgo/privsep.sock']#033[00m
Nov 22 02:40:17 np0005531888 NetworkManager[55166]: <info>  [1763797217.4662] device (tapb405020c-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:40:17 np0005531888 NetworkManager[55166]: <info>  [1763797217.4673] device (tapb405020c-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.470 186792 DEBUG nova.network.neutron [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Updated VIF entry in instance network info cache for port b405020c-4786-4b1d-9c68-6ab4d6d7689a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.470 186792 DEBUG nova.network.neutron [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Updating instance_info_cache with network_info: [{"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:17Z|00029|binding|INFO|Setting lport b405020c-4786-4b1d-9c68-6ab4d6d7689a ovn-installed in OVS
Nov 22 02:40:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:17Z|00030|binding|INFO|Setting lport b405020c-4786-4b1d-9c68-6ab4d6d7689a up in Southbound
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.495 186792 DEBUG oslo_concurrency.lockutils [req-b1acf94a-d9aa-49af-b7d0-dcbf89df7534 req-2c103770-59ad-4cee-8357-22a95c3239aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.496 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:17 np0005531888 systemd-machined[153106]: New machine qemu-2-instance-00000008.
Nov 22 02:40:17 np0005531888 systemd[1]: Started Virtual Machine qemu-2-instance-00000008.
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.895 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797217.894499, 8f7e8343-7bd9-402c-bea5-4a3202b54681 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.896 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] VM Started (Lifecycle Event)#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.933 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.938 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797217.895501, 8f7e8343-7bd9-402c-bea5-4a3202b54681 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.938 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.966 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:17 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.970 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:18 np0005531888 nova_compute[186788]: 2025-11-22 07:40:17.999 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.147 104023 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.148 104023 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp94hfjbgo/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.015 213587 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.020 213587 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.023 213587 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.023 213587 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213587#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.151 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[faf2872c-4d6d-4f27-a7d2-7331d6d24711]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:18 np0005531888 podman[213592]: 2025-11-22 07:40:18.698932892 +0000 UTC m=+0.066179185 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.745 213587 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.745 213587 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:18.745 213587 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.403 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[18ac1502-139e-4759-aef1-10effe2b0353]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.405 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1ae6b2a9-f1 in ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.407 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1ae6b2a9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.407 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e19145c0-9941-40de-b67c-3fb7e9ff20a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.411 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6c912c-005d-426f-93b5-f3b5cfe16de7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.434 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c36fd8-53fb-4049-bec1-16555abad1f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.452 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dad1de60-c833-4e50-9819-20560d34176e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:19.455 104023 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp3tc3i1rp/privsep.sock']#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.204 104023 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.207 104023 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3tc3i1rp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.038 213625 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.043 213625 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.045 213625 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.045 213625 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213625#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.211 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[592f9d81-496d-4617-b2e5-2fe979641a64]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:20 np0005531888 nova_compute[186788]: 2025-11-22 07:40:20.755 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.790 213625 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.790 213625 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:20.790 213625 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:20 np0005531888 nova_compute[186788]: 2025-11-22 07:40:20.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.411 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[646d9d87-f8a1-48d5-af65-1d11e46f4b5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 NetworkManager[55166]: <info>  [1763797221.4331] manager: (tap1ae6b2a9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.431 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[056cae08-8fdf-40d3-9dd4-d559d9f99a57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 systemd-udevd[213637]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.468 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[136ae6c7-20ab-43d4-974f-0e460d57967f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.472 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d7dba793-abcf-4ccf-a5e9-c67315ccca7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 NetworkManager[55166]: <info>  [1763797221.5013] device (tap1ae6b2a9-f0): carrier: link connected
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.507 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f7574c6f-2893-47bb-a92c-a9592c01415b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.526 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[129f0d0d-247d-4f0d-bbe3-0e8fb89a0d3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1ae6b2a9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:30:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396214, 'reachable_time': 32320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213655, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.541 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec75445-4a36-4a02-9c03-dc69208d302e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:30cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396214, 'tstamp': 396214}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213656, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.561 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e961ddf4-9e56-4c7d-ad77-ceac1abc242c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1ae6b2a9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:30:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396214, 'reachable_time': 32320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213657, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.593 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3003a7e0-5e71-4237-b6a4-10cf717816a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.653 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1820fbbe-b90f-403f-8bb6-8ca7e7f1baa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.655 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ae6b2a9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.656 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.656 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ae6b2a9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:21 np0005531888 NetworkManager[55166]: <info>  [1763797221.6987] manager: (tap1ae6b2a9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 22 02:40:21 np0005531888 nova_compute[186788]: 2025-11-22 07:40:21.697 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531888 kernel: tap1ae6b2a9-f0: entered promiscuous mode
Nov 22 02:40:21 np0005531888 nova_compute[186788]: 2025-11-22 07:40:21.700 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.702 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1ae6b2a9-f0, col_values=(('external_ids', {'iface-id': '254524bd-994c-43d1-84a4-6c0edeae0f13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:21 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:21Z|00031|binding|INFO|Releasing lport 254524bd-994c-43d1-84a4-6c0edeae0f13 from this chassis (sb_readonly=0)
Nov 22 02:40:21 np0005531888 nova_compute[186788]: 2025-11-22 07:40:21.703 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531888 nova_compute[186788]: 2025-11-22 07:40:21.704 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.704 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1ae6b2a9-f586-4520-bc3d-923fe57139cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1ae6b2a9-f586-4520-bc3d-923fe57139cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.705 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4cd477-2b52-4bc5-9376-34266c5002a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.708 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-1ae6b2a9-f586-4520-bc3d-923fe57139cb
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/1ae6b2a9-f586-4520-bc3d-923fe57139cb.pid.haproxy
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 1ae6b2a9-f586-4520-bc3d-923fe57139cb
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:40:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:21.709 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'env', 'PROCESS_TAG=haproxy-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1ae6b2a9-f586-4520-bc3d-923fe57139cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:40:21 np0005531888 nova_compute[186788]: 2025-11-22 07:40:21.714 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:22 np0005531888 podman[213690]: 2025-11-22 07:40:22.118556446 +0000 UTC m=+0.074159639 container create 986d26a39235d1a351ba467e1c14ae45ddd4e1e8c041a2ae53d41f0449c91134 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:40:22 np0005531888 podman[213690]: 2025-11-22 07:40:22.07088393 +0000 UTC m=+0.026487173 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:40:22 np0005531888 systemd[1]: Started libpod-conmon-986d26a39235d1a351ba467e1c14ae45ddd4e1e8c041a2ae53d41f0449c91134.scope.
Nov 22 02:40:22 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:40:22 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f4cf5e15085995c4bc07078abad2ea2435c89ae74874565db9a9a146fba900f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:40:22 np0005531888 podman[213690]: 2025-11-22 07:40:22.215775143 +0000 UTC m=+0.171378366 container init 986d26a39235d1a351ba467e1c14ae45ddd4e1e8c041a2ae53d41f0449c91134 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:40:22 np0005531888 podman[213690]: 2025-11-22 07:40:22.221902132 +0000 UTC m=+0.177505325 container start 986d26a39235d1a351ba467e1c14ae45ddd4e1e8c041a2ae53d41f0449c91134 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:40:22 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[213705]: [NOTICE]   (213709) : New worker (213711) forked
Nov 22 02:40:22 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[213705]: [NOTICE]   (213709) : Loading success.
Nov 22 02:40:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:23.223 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:40:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:23.224 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:40:23 np0005531888 nova_compute[186788]: 2025-11-22 07:40:23.224 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:23 np0005531888 nova_compute[186788]: 2025-11-22 07:40:23.264 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Automatically allocated network: {'id': 'cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'name': 'auto_allocated_network', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['68dcf6dc-373a-4168-81d5-f04abc5d8ac8', '826b0fb5-b3d0-49b5-b40e-079f62557646'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-11-22T07:39:55Z', 'updated_at': '2025-11-22T07:40:10Z', 'revision_number': 4, 'project_id': '98627e04b62e4ce4bf9650377c674f73'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Nov 22 02:40:23 np0005531888 nova_compute[186788]: 2025-11-22 07:40:23.266 186792 DEBUG nova.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:40:23 np0005531888 podman[213720]: 2025-11-22 07:40:23.685908366 +0000 UTC m=+0.060662293 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 02:40:25 np0005531888 nova_compute[186788]: 2025-11-22 07:40:25.624 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Successfully created port: f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:40:25 np0005531888 nova_compute[186788]: 2025-11-22 07:40:25.758 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:25 np0005531888 nova_compute[186788]: 2025-11-22 07:40:25.867 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.570 186792 DEBUG nova.compute.manager [req-af519994-b401-4fa7-a40f-ffe57fc8d098 req-faed5978-c994-46ad-95e0-c755622d7914 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Received event network-vif-plugged-b405020c-4786-4b1d-9c68-6ab4d6d7689a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.571 186792 DEBUG oslo_concurrency.lockutils [req-af519994-b401-4fa7-a40f-ffe57fc8d098 req-faed5978-c994-46ad-95e0-c755622d7914 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.571 186792 DEBUG oslo_concurrency.lockutils [req-af519994-b401-4fa7-a40f-ffe57fc8d098 req-faed5978-c994-46ad-95e0-c755622d7914 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.571 186792 DEBUG oslo_concurrency.lockutils [req-af519994-b401-4fa7-a40f-ffe57fc8d098 req-faed5978-c994-46ad-95e0-c755622d7914 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.572 186792 DEBUG nova.compute.manager [req-af519994-b401-4fa7-a40f-ffe57fc8d098 req-faed5978-c994-46ad-95e0-c755622d7914 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Processing event network-vif-plugged-b405020c-4786-4b1d-9c68-6ab4d6d7689a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.572 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.576 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797226.5767531, 8f7e8343-7bd9-402c-bea5-4a3202b54681 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.577 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.578 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.583 186792 INFO nova.virt.libvirt.driver [-] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Instance spawned successfully.#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.583 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.600 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.601 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.602 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.602 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.603 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.603 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.604 186792 DEBUG nova.virt.libvirt.driver [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.610 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.632 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.983 186792 INFO nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Took 21.35 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:40:26 np0005531888 nova_compute[186788]: 2025-11-22 07:40:26.984 186792 DEBUG nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:27 np0005531888 nova_compute[186788]: 2025-11-22 07:40:27.230 186792 INFO nova.compute.manager [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Took 22.42 seconds to build instance.#033[00m
Nov 22 02:40:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:27.232 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:27 np0005531888 nova_compute[186788]: 2025-11-22 07:40:27.278 186792 DEBUG oslo_concurrency.lockutils [None req-18d41404-c914-474c-b5d0-b68fb7723405 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.297 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Successfully updated port: f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.372 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "refresh_cache-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.373 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquired lock "refresh_cache-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.373 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.682 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:40:28 np0005531888 podman[213741]: 2025-11-22 07:40:28.709529247 +0000 UTC m=+0.075052421 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.721 186792 DEBUG nova.compute.manager [req-43978c52-d494-4188-8c28-a9306a1a37dc req-9dc07cb6-5979-494d-b7ba-ad1ce934d795 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Received event network-vif-plugged-b405020c-4786-4b1d-9c68-6ab4d6d7689a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.721 186792 DEBUG oslo_concurrency.lockutils [req-43978c52-d494-4188-8c28-a9306a1a37dc req-9dc07cb6-5979-494d-b7ba-ad1ce934d795 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.721 186792 DEBUG oslo_concurrency.lockutils [req-43978c52-d494-4188-8c28-a9306a1a37dc req-9dc07cb6-5979-494d-b7ba-ad1ce934d795 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.721 186792 DEBUG oslo_concurrency.lockutils [req-43978c52-d494-4188-8c28-a9306a1a37dc req-9dc07cb6-5979-494d-b7ba-ad1ce934d795 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8f7e8343-7bd9-402c-bea5-4a3202b54681-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.721 186792 DEBUG nova.compute.manager [req-43978c52-d494-4188-8c28-a9306a1a37dc req-9dc07cb6-5979-494d-b7ba-ad1ce934d795 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] No waiting events found dispatching network-vif-plugged-b405020c-4786-4b1d-9c68-6ab4d6d7689a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:40:28 np0005531888 nova_compute[186788]: 2025-11-22 07:40:28.722 186792 WARNING nova.compute.manager [req-43978c52-d494-4188-8c28-a9306a1a37dc req-9dc07cb6-5979-494d-b7ba-ad1ce934d795 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Received unexpected event network-vif-plugged-b405020c-4786-4b1d-9c68-6ab4d6d7689a for instance with vm_state active and task_state None.#033[00m
Nov 22 02:40:29 np0005531888 nova_compute[186788]: 2025-11-22 07:40:29.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:29 np0005531888 nova_compute[186788]: 2025-11-22 07:40:29.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:40:29 np0005531888 nova_compute[186788]: 2025-11-22 07:40:29.957 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:40:29 np0005531888 nova_compute[186788]: 2025-11-22 07:40:29.994 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.491 186792 DEBUG nova.compute.manager [req-4fb69516-743a-4aa5-933e-7335c3af0475 req-575cfac1-772a-4339-a342-9b26ff93e3f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Received event network-changed-f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.492 186792 DEBUG nova.compute.manager [req-4fb69516-743a-4aa5-933e-7335c3af0475 req-575cfac1-772a-4339-a342-9b26ff93e3f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Refreshing instance network info cache due to event network-changed-f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.492 186792 DEBUG oslo_concurrency.lockutils [req-4fb69516-743a-4aa5-933e-7335c3af0475 req-575cfac1-772a-4339-a342-9b26ff93e3f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.515 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.516 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.516 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.516 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8f7e8343-7bd9-402c-bea5-4a3202b54681 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:30 np0005531888 podman[213762]: 2025-11-22 07:40:30.690151044 +0000 UTC m=+0.051718185 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:30 np0005531888 nova_compute[186788]: 2025-11-22 07:40:30.870 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5153] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5165] device (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5175] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/26)
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5177] device (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5183] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5188] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5191] device (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 02:40:33 np0005531888 NetworkManager[55166]: <info>  [1763797233.5192] device (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 02:40:33 np0005531888 nova_compute[186788]: 2025-11-22 07:40:33.532 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:33 np0005531888 nova_compute[186788]: 2025-11-22 07:40:33.623 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:33 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:33Z|00032|binding|INFO|Releasing lport 254524bd-994c-43d1-84a4-6c0edeae0f13 from this chassis (sb_readonly=0)
Nov 22 02:40:33 np0005531888 nova_compute[186788]: 2025-11-22 07:40:33.646 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:33 np0005531888 podman[213786]: 2025-11-22 07:40:33.707010764 +0000 UTC m=+0.067558568 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.502 186792 DEBUG nova.compute.manager [req-852f6127-d50b-444d-bd33-58b7ad820d03 req-c6b4c289-5456-4f3b-9e46-983375958aa2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Received event network-changed-b405020c-4786-4b1d-9c68-6ab4d6d7689a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.503 186792 DEBUG nova.compute.manager [req-852f6127-d50b-444d-bd33-58b7ad820d03 req-c6b4c289-5456-4f3b-9e46-983375958aa2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Refreshing instance network info cache due to event network-changed-b405020c-4786-4b1d-9c68-6ab4d6d7689a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.503 186792 DEBUG oslo_concurrency.lockutils [req-852f6127-d50b-444d-bd33-58b7ad820d03 req-c6b4c289-5456-4f3b-9e46-983375958aa2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.547 186792 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Updating instance_info_cache with network_info: [{"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.626 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Releasing lock "refresh_cache-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.627 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Instance network_info: |[{"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.627 186792 DEBUG oslo_concurrency.lockutils [req-4fb69516-743a-4aa5-933e-7335c3af0475 req-575cfac1-772a-4339-a342-9b26ff93e3f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.628 186792 DEBUG nova.network.neutron [req-4fb69516-743a-4aa5-933e-7335c3af0475 req-575cfac1-772a-4339-a342-9b26ff93e3f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Refreshing network info cache for port f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.631 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Start _get_guest_xml network_info=[{"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.638 186792 WARNING nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.646 186792 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.648 186792 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.656 186792 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.657 186792 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.659 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.659 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.660 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.660 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.660 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.660 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.661 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.661 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.661 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.661 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.662 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.662 186792 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.666 186792 DEBUG nova.virt.libvirt.vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-1',id=4,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:39:54Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.667 186792 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.668 186792 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:02:64,bridge_name='br-int',has_traffic_filtering=True,id=f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dbb0ec-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.668 186792 DEBUG nova.objects.instance [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.682 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <uuid>9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc</uuid>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <name>instance-00000004</name>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:name>tempest-tempest.common.compute-instance-1346960213-1</nova:name>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:40:34</nova:creationTime>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:user uuid="12b223a79f8b4927861908eb11663fb5">tempest-AutoAllocateNetworkTest-83498172-project-member</nova:user>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:project uuid="98627e04b62e4ce4bf9650377c674f73">tempest-AutoAllocateNetworkTest-83498172</nova:project>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        <nova:port uuid="f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="fdfe:381f:8400::1e5" ipVersion="6"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.1.0.6" ipVersion="4"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <entry name="serial">9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc</entry>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <entry name="uuid">9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc</entry>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.config"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:42:02:64"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <target dev="tapf3dbb0ec-bc"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/console.log" append="off"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:40:34 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:40:34 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:40:34 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:40:34 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.684 186792 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Preparing to wait for external event network-vif-plugged-f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.684 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.684 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.685 186792 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.685 186792 DEBUG nova.virt.libvirt.vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-1',id=4,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:39:54Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.686 186792 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "address": "fa:16:3e:42:02:64", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1e5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3dbb0ec-bc", "ovs_interfaceid": "f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.686 186792 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:02:64,bridge_name='br-int',has_traffic_filtering=True,id=f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dbb0ec-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.687 186792 DEBUG os_vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:02:64,bridge_name='br-int',has_traffic_filtering=True,id=f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dbb0ec-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.688 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.688 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.689 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.692 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3dbb0ec-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.692 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf3dbb0ec-bc, col_values=(('external_ids', {'iface-id': 'f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:02:64', 'vm-uuid': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.694 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:34 np0005531888 NetworkManager[55166]: <info>  [1763797234.6952] manager: (tapf3dbb0ec-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.696 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.704 186792 INFO os_vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:02:64,bridge_name='br-int',has_traffic_filtering=True,id=f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3dbb0ec-bc')#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.818 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.820 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.820 186792 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No VIF found with MAC fa:16:3e:42:02:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:40:34 np0005531888 nova_compute[186788]: 2025-11-22 07:40:34.821 186792 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Using config drive#033[00m
Nov 22 02:40:35 np0005531888 nova_compute[186788]: 2025-11-22 07:40:35.445 186792 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Creating config drive at /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.config#033[00m
Nov 22 02:40:35 np0005531888 nova_compute[186788]: 2025-11-22 07:40:35.450 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpss5wotof execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:35 np0005531888 nova_compute[186788]: 2025-11-22 07:40:35.577 186792 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpss5wotof" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:35 np0005531888 kernel: tapf3dbb0ec-bc: entered promiscuous mode
Nov 22 02:40:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:35Z|00033|binding|INFO|Claiming lport f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f for this chassis.
Nov 22 02:40:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:35Z|00034|binding|INFO|f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f: Claiming fa:16:3e:42:02:64 10.1.0.6 fdfe:381f:8400::1e5
Nov 22 02:40:35 np0005531888 NetworkManager[55166]: <info>  [1763797235.6552] manager: (tapf3dbb0ec-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Nov 22 02:40:35 np0005531888 nova_compute[186788]: 2025-11-22 07:40:35.661 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:35Z|00035|binding|INFO|Setting lport f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f ovn-installed in OVS
Nov 22 02:40:35 np0005531888 nova_compute[186788]: 2025-11-22 07:40:35.668 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:35 np0005531888 systemd-udevd[213828]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:40:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:35Z|00036|binding|INFO|Setting lport f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f up in Southbound
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.684 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:02:64 10.1.0.6 fdfe:381f:8400::1e5'], port_security=['fa:16:3e:42:02:64 10.1.0.6 fdfe:381f:8400::1e5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.6/26 fdfe:381f:8400::1e5/64', 'neutron:device_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98627e04b62e4ce4bf9650377c674f73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '931bf7c3-500b-4034-8d8e-f18219ff1b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6120d3e5-4a9e-45cc-93a1-87b92bf94714, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.685 104023 INFO neutron.agent.ovn.metadata.agent [-] Port f3dbb0ec-bc13-4305-a0bd-fc9b9e93848f in datapath cd94b117-ddd2-457a-a1e9-a1e03ac67322 bound to our chassis#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.688 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd94b117-ddd2-457a-a1e9-a1e03ac67322#033[00m
Nov 22 02:40:35 np0005531888 NetworkManager[55166]: <info>  [1763797235.6951] device (tapf3dbb0ec-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:40:35 np0005531888 NetworkManager[55166]: <info>  [1763797235.6960] device (tapf3dbb0ec-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:40:35 np0005531888 systemd-machined[153106]: New machine qemu-3-instance-00000004.
Nov 22 02:40:35 np0005531888 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.701 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[011da5d4-943e-4815-b537-a65af264aa56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.702 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd94b117-d1 in ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.704 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd94b117-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.704 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3bf6e2-0470-4b77-a617-3f47a4af5570]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.705 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[265333a8-4ddb-4868-91f8-5e447f17627b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.730 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[66d0ab0a-6cd4-4a57-acaf-dfc4a069937a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.749 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b06309fe-046b-4bff-9b19-e34ac14339b1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.787 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8899b5c5-f9ec-486a-a75a-70a7b9dc82f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.794 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8e3a5be3-a3ca-468c-ba3f-82f5a9e214a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 NetworkManager[55166]: <info>  [1763797235.7958] manager: (tapcd94b117-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.834 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[0c39fb4a-f1d1-49bb-a07a-6f24e05fe981]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.838 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[433bc99f-72a0-4655-86e9-e809b82ca9e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 NetworkManager[55166]: <info>  [1763797235.8665] device (tapcd94b117-d0): carrier: link connected
Nov 22 02:40:35 np0005531888 nova_compute[186788]: 2025-11-22 07:40:35.871 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.874 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0e5f16-0e05-4a9f-8cb8-b6c57ea0d977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.891 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[567b3a7f-e86a-4d38-9e71-3754d81c49f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd94b117-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:df:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397651, 'reachable_time': 19785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213863, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.910 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3d7de6-fe63-450e-9824-9b083fbeab11]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:dfb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397651, 'tstamp': 397651}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213864, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.930 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[83d3b4e3-2a57-4eba-ac00-444b0f30f609]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd94b117-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:df:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397651, 'reachable_time': 19785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213865, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:35.963 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[de975f0f-8724-4b32-8c27-b0021a07c2f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.025 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8cea03f9-d5d0-4ac2-9f02-a2f2bdc77313]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.027 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd94b117-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.028 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.028 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd94b117-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:36 np0005531888 NetworkManager[55166]: <info>  [1763797236.0311] manager: (tapcd94b117-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 22 02:40:36 np0005531888 kernel: tapcd94b117-d0: entered promiscuous mode
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.030 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.034 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd94b117-d0, col_values=(('external_ids', {'iface-id': 'f15694ec-11c8-44d4-a18a-7277c1308d45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:36 np0005531888 ovn_controller[95067]: 2025-11-22T07:40:36Z|00037|binding|INFO|Releasing lport f15694ec-11c8-44d4-a18a-7277c1308d45 from this chassis (sb_readonly=0)
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.037 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.037 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.046 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[69118741-060f-4151-87e8-4709ba80b1a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.049 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.049 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.050 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'env', 'PROCESS_TAG=haproxy-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd94b117-ddd2-457a-a1e9-a1e03ac67322.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:40:36 np0005531888 podman[213896]: 2025-11-22 07:40:36.482901001 +0000 UTC m=+0.074453835 container create 364d4cf66856d909d6d125c9d0a246994ea1bbd629cefcbd762a2334ce53bac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:40:36 np0005531888 podman[213896]: 2025-11-22 07:40:36.433461373 +0000 UTC m=+0.025014227 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:40:36 np0005531888 systemd[1]: Started libpod-conmon-364d4cf66856d909d6d125c9d0a246994ea1bbd629cefcbd762a2334ce53bac8.scope.
Nov 22 02:40:36 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:40:36 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/536cbb60faca2b55de8cbfc7202e453fa2057108d7119397b08f7e8d60049d35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:40:36 np0005531888 podman[213896]: 2025-11-22 07:40:36.587213701 +0000 UTC m=+0.178766565 container init 364d4cf66856d909d6d125c9d0a246994ea1bbd629cefcbd762a2334ce53bac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:40:36 np0005531888 podman[213896]: 2025-11-22 07:40:36.594333503 +0000 UTC m=+0.185886337 container start 364d4cf66856d909d6d125c9d0a246994ea1bbd629cefcbd762a2334ce53bac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 02:40:36 np0005531888 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213911]: [NOTICE]   (213915) : New worker (213917) forked
Nov 22 02:40:36 np0005531888 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213911]: [NOTICE]   (213915) : Loading success.
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.793 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.794 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:40:36.796 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.890 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Updating instance_info_cache with network_info: [{"id": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "address": "fa:16:3e:72:68:db", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb405020c-47", "ovs_interfaceid": "b405020c-4786-4b1d-9c68-6ab4d6d7689a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.958 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.959 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.960 186792 DEBUG oslo_concurrency.lockutils [req-852f6127-d50b-444d-bd33-58b7ad820d03 req-c6b4c289-5456-4f3b-9e46-983375958aa2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8f7e8343-7bd9-402c-bea5-4a3202b54681" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.960 186792 DEBUG nova.network.neutron [req-852f6127-d50b-444d-bd33-58b7ad820d03 req-c6b4c289-5456-4f3b-9e46-983375958aa2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Refreshing network info cache for port b405020c-4786-4b1d-9c68-6ab4d6d7689a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.961 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.962 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.962 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.964 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.990 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.990 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.991 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:36 np0005531888 nova_compute[186788]: 2025-11-22 07:40:36.991 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.296 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.360 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.361 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.423 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.432 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:37.484 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}fc7d9c7ecd26b234eaa30e5ea11275171088da69ea400a9f08d986f43ed0da80" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.509 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.511 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.582 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797237.5816932, 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.583 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] VM Started (Lifecycle Event)#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.593 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f7e8343-7bd9-402c-bea5-4a3202b54681/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.656 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.659 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797237.58204, 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.659 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.699 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.703 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:37.704 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1183 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:37 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b216673a-d766-4a45-93d7-5fd76c576729 x-openstack-request-id: req-b216673a-d766-4a45-93d7-5fd76c576729 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:40:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:37.704 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1890587748", "name": "tempest-flavor_with_ephemeral_0-22804361", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1890587748"}]}, {"id": "1c351edf-5b2d-477d-93d0-c380bdae83e7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}]}, {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}, {"id": "962835561", "name": "tempest-flavor_with_ephemeral_1-2110312460", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/962835561"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/962835561"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:40:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:37.704 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-b216673a-d766-4a45-93d7-5fd76c576729 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:40:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:37.706 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}fc7d9c7ecd26b234eaa30e5ea11275171088da69ea400a9f08d986f43ed0da80" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.751 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.809 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.811 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5583MB free_disk=73.46039199829102GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.811 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.811 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.915 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.915 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 8f7e8343-7bd9-402c-bea5-4a3202b54681 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.916 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:40:37 np0005531888 nova_compute[186788]: 2025-11-22 07:40:37.916 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:40:38 np0005531888 nova_compute[186788]: 2025-11-22 07:40:38.102 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:40:38 np0005531888 nova_compute[186788]: 2025-11-22 07:40:38.122 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:40:38 np0005531888 nova_compute[186788]: 2025-11-22 07:40:38.154 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:40:38 np0005531888 nova_compute[186788]: 2025-11-22 07:40:38.155 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.281 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:37 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d41ecaad-79b7-4cc7-8cfd-e86f63d911c7 x-openstack-request-id: req-d41ecaad-79b7-4cc7-8cfd-e86f63d911c7 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.281 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.281 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a used request id req-d41ecaad-79b7-4cc7-8cfd-e86f63d911c7 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.284 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'name': 'tempest-tempest.common.compute-instance-1346960213-1', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'user_id': '12b223a79f8b4927861908eb11663fb5', 'hostId': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.287 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}fc7d9c7ecd26b234eaa30e5ea11275171088da69ea400a9f08d986f43ed0da80" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.483 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1183 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:38 GMT Keep-Alive: timeout=5, max=98 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-8cd9f8cc-69c2-4767-a811-598c9967c4bf x-openstack-request-id: req-8cd9f8cc-69c2-4767-a811-598c9967c4bf _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.483 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1890587748", "name": "tempest-flavor_with_ephemeral_0-22804361", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1890587748"}]}, {"id": "1c351edf-5b2d-477d-93d0-c380bdae83e7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}]}, {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}, {"id": "962835561", "name": "tempest-flavor_with_ephemeral_1-2110312460", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/962835561"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/962835561"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.483 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-8cd9f8cc-69c2-4767-a811-598c9967c4bf request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.485 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}fc7d9c7ecd26b234eaa30e5ea11275171088da69ea400a9f08d986f43ed0da80" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.641 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 450 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:38 GMT Keep-Alive: timeout=5, max=97 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e2416385-ece8-41e6-a041-a5cb41cd7bb2 x-openstack-request-id: req-e2416385-ece8-41e6-a041-a5cb41cd7bb2 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.641 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "1890587748", "name": "tempest-flavor_with_ephemeral_0-22804361", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1890587748"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.641 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748 used request id req-e2416385-ece8-41e6-a041-a5cb41cd7bb2 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.642 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000008', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'hostId': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.643 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.643 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.643 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-1>, <NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-2058718492>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-1>, <NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-2058718492>]
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.649 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc / tapf3dbb0ec-bc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.649 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.652 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8f7e8343-7bd9-402c-bea5-4a3202b54681 / tapb405020c-47 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.653 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dd53e85-8e4e-4f51-b187-7d23c610464e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000004-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-tapf3dbb0ec-bc', 'timestamp': '2025-11-22T07:40:38.645853', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'tapf3dbb0ec-bc', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3dbb0ec-bc'}, 'message_id': '8a2915b6-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.345581748, 'message_signature': '6245640e3b05369876421f74bf6fdc057101e8c1f6477eeaa2834cac2a2491f9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': 'instance-00000008-8f7e8343-7bd9-402c-bea5-4a3202b54681-tapb405020c-47', 'timestamp': '2025-11-22T07:40:38.645853', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'tapb405020c-47', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:68:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb405020c-47'}, 'message_id': '8a298c44-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.350108049, 'message_signature': '7cd87455a85de1752f41e2aef3aae4ae7f0cac17b731d9366445e65628238f61'}]}, 'timestamp': '2025-11-22 07:40:38.653400', '_unique_id': '0a02e817b70445809f475aec1626a121'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.663 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.663 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.664 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd900df11-f141-4b58-9639-d5e464b50dcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000004-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-tapf3dbb0ec-bc', 'timestamp': '2025-11-22T07:40:38.663702', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'tapf3dbb0ec-bc', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3dbb0ec-bc'}, 'message_id': '8a2b3008-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.345581748, 'message_signature': 'fef4883f82af3edf027635ea6ed7049fc924d9182a972b78a3130842e4b0b1fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': 'instance-00000008-8f7e8343-7bd9-402c-bea5-4a3202b54681-tapb405020c-47', 'timestamp': '2025-11-22T07:40:38.663702', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'tapb405020c-47', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:68:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb405020c-47'}, 'message_id': '8a2b3d64-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.350108049, 'message_signature': '92410fb7fe47bb164123dece2b299b295dac53b2430b50d6c1e7784557886778'}]}, 'timestamp': '2025-11-22 07:40:38.664448', '_unique_id': 'fa0e7615379842d89a901915610a57dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.666 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.678 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.679 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.692 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.692 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5bdb1af-624b-430d-a043-1aa32bb7f0a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-vda', 'timestamp': '2025-11-22T07:40:38.666155', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a2d7ec6-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.36584425, 'message_signature': '768c2c4fa0f78787bae2ee5d448cd3a12fa5df3a677649a005cadc8656b4f988'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 
'9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-sda', 'timestamp': '2025-11-22T07:40:38.666155', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a2d8bbe-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.36584425, 'message_signature': '7b2e3c0f09f0ac79e21c44ba13fe8ea212d8542bc978706b16a257ddb984cb91'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-vda', 'timestamp': '2025-11-22T07:40:38.666155', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a2f8e8c-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.379104381, 'message_signature': 'a1b4e83480e87e59605ac2584bcb13a7557baa1931248d93dc1d914e51937d4c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-sda', 'timestamp': '2025-11-22T07:40:38.666155', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a2f9eae-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.379104381, 'message_signature': '7c2f08e19c4a67f3382a3de88ffc1108dfcb5ecbc07416dc7342c764f8ed3d4f'}]}, 'timestamp': '2025-11-22 07:40:38.693204', '_unique_id': 'd85c7624834a4ab2af9f3e36179ad527'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.696 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.729 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.730 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.769 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.770 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c97bde8e-3955-4163-b217-af310278831a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-vda', 'timestamp': '2025-11-22T07:40:38.696220', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a353c6a-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.395915938, 'message_signature': 'a9a0083dca9185ed782cad4e98759a3ac540c8101d14f5aff3e89d6a7fbfd0d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 
'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-sda', 'timestamp': '2025-11-22T07:40:38.696220', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3553f8-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.395915938, 'message_signature': 'c8641ac6f2009a3a3c7b469d63c93585c73646fde571137e375516045259c80a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-vda', 'timestamp': '2025-11-22T07:40:38.696220', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a3b62c0-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.430108838, 'message_signature': '3f1c707dc69e5ec5839e20e8559e023649608847512f4ca71aa4c7886fa0f267'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-sda', 'timestamp': '2025-11-22T07:40:38.696220', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3b6f90-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.430108838, 'message_signature': '75ed489a5a8b36be660e73e8ed378ce85d351a83280ab93d6bd3ad56ef8bca53'}]}, 'timestamp': '2025-11-22 07:40:38.770673', '_unique_id': '729687b073c64191bf58387fe29d0485'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.771 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.772 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.772 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.773 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.773 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.773 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c021c43-fa88-4a14-bee8-338bb190e5a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-vda', 'timestamp': '2025-11-22T07:40:38.772905', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a3bd610-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.395915938, 'message_signature': '227fcdbb00af6dcaaa9b14b81c3e374b8284c4c9ea904c924acad43db8aaf978'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 
'project_name': None, 'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-sda', 'timestamp': '2025-11-22T07:40:38.772905', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3be1e6-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.395915938, 'message_signature': 'ba3035ab46e0472ef3b837bf967cd636bd850f83f0db406c350d477c38e2d833'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-vda', 'timestamp': '2025-11-22T07:40:38.772905', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a3bed94-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.430108838, 'message_signature': '807ae7557e9ab6d542bebac15178fcc1165806d40cbc4888a07885b6fbf9dcd5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-sda', 'timestamp': '2025-11-22T07:40:38.772905', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3bf834-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.430108838, 'message_signature': 'f281c7b8a45f913165a66c627016334b4ee01012b612443877d58b80d9e238e1'}]}, 'timestamp': '2025-11-22 07:40:38.774118', '_unique_id': 
'b81506fc636f4ada902ae735d1be2dfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.776 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.776 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.777 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.777 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '456b6023-538a-402b-9182-591a7905fb49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-vda', 'timestamp': '2025-11-22T07:40:38.776358', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a3c5c70-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.395915938, 'message_signature': '58d91a688ae510d91f783575dde28b576aa1b34a8c6a9528bf7fd80c91a2e0c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': 
None, 'resource_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-sda', 'timestamp': '2025-11-22T07:40:38.776358', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'instance-00000004', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3c6918-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.395915938, 'message_signature': '32271dd2628c35c17c76cc053570d401ab7c7f177041a484baca48750f71068f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-vda', 'timestamp': '2025-11-22T07:40:38.776358', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a3c73cc-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.430108838, 'message_signature': 'd32b730fbe3fd774acba250fd9a1dd1a7d7cf7380169bee921b96c3c3f81491a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681-sda', 'timestamp': '2025-11-22T07:40:38.776358', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'instance-00000008', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3c7e3a-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.430108838, 'message_signature': '34984c1bd49668a654232804ced94787b1133af0b5b786ec10036ce535c5ae8a'}]}, 'timestamp': '2025-11-22 07:40:38.777543', '_unique_id': 
'99f59af67d0e447ea3c9e6c45e28a96a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.779 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.780 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5946d89-aa66-4e83-8610-d442a672b81f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000004-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-tapf3dbb0ec-bc', 'timestamp': '2025-11-22T07:40:38.779686', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'tapf3dbb0ec-bc', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3dbb0ec-bc'}, 'message_id': '8a3cdf24-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.345581748, 'message_signature': 'b5b8cb47d28bd556446e0a5fbdf6b54950bb244688cccae775cf7430a4028750'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': 'instance-00000008-8f7e8343-7bd9-402c-bea5-4a3202b54681-tapb405020c-47', 'timestamp': '2025-11-22T07:40:38.779686', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'tapb405020c-47', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:68:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb405020c-47'}, 'message_id': '8a3ced34-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.350108049, 'message_signature': 'ec474e23bd271c4279b387d46a5bb3dc445aa52a5622f662e744db327ed4a574'}]}, 'timestamp': '2025-11-22 07:40:38.780399', '_unique_id': 'fb92cf3e5f0c49f6a25d995b622657e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.782 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.782 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8931c6d-643a-4962-a2e2-b58f3d793c1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000004-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-tapf3dbb0ec-bc', 'timestamp': '2025-11-22T07:40:38.782355', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'tapf3dbb0ec-bc', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3dbb0ec-bc'}, 'message_id': '8a3d46f8-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.345581748, 'message_signature': '384d93c43eae2559fc9983aecedc044255554132d474e83ace390d2eb68f18ff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': 'instance-00000008-8f7e8343-7bd9-402c-bea5-4a3202b54681-tapb405020c-47', 'timestamp': '2025-11-22T07:40:38.782355', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'tapb405020c-47', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:68:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb405020c-47'}, 'message_id': '8a3d544a-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.350108049, 'message_signature': '003562357f5b342fb67d7a345117c3ba01fdbb4fb5fb74d3eb8f49fa2e1c1b18'}]}, 'timestamp': '2025-11-22 07:40:38.783024', '_unique_id': '50709ce0cd2046e3b51391df851d52d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.784 12 DEBUG ceilometer.compute.pollsters [-] 9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.785 12 DEBUG ceilometer.compute.pollsters [-] 8f7e8343-7bd9-402c-bea5-4a3202b54681/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d7ab61a-80c0-422e-8e01-e053b00257bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000004-9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc-tapf3dbb0ec-bc', 'timestamp': '2025-11-22T07:40:38.784843', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-1', 'name': 'tapf3dbb0ec-bc', 'instance_id': '9c54bcfc-c187-44b0-99ff-d9cde3ddf7fc', 'instance_type': 'm1.nano', 'host': '47067cdc294566f12342b54a33c0b2843b5ad7d6675d023c4344df9a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3dbb0ec-bc'}, 'message_id': '8a3da792-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.345581748, 'message_signature': 'ffac69f92a6f73782541d13962dcd41a8ae359c4777730611c8fafc7f56d36e7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_name': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_name': None, 'resource_id': 'instance-00000008-8f7e8343-7bd9-402c-bea5-4a3202b54681-tapb405020c-47', 'timestamp': '2025-11-22T07:40:38.784843', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-2058718492', 'name': 'tapb405020c-47', 'instance_id': '8f7e8343-7bd9-402c-bea5-4a3202b54681', 'instance_type': 'tempest-flavor_with_ephemeral_0-22804361', 'host': 'b2d4d21428043dd443791314f92a4e4ff7f34d747e4aba9a3d8b94f9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1890587748', 'name': 'tempest-flavor_with_ephemeral_0-22804361', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:68:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb405020c-47'}, 'message_id': '8a3db32c-c776-11f0-941d-fa163e6775e5', 'monotonic_time': 3979.350108049, 'message_signature': 'd6372930c9e12c4e4bb0e60711d94d0f60ac345ca64e5872887b79332163a561'}]}, 'timestamp': '2025-11-22 07:40:38.785460', '_unique_id': 'b26127de214c49bd9632e4a3139e70ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:38 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:40:38.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:41:03 np0005531888 nova_compute[186788]: 2025-11-22 07:41:03.786 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531888 rsyslogd[1010]: imjournal: 967 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 22 02:41:04 np0005531888 podman[214276]: 2025-11-22 07:41:04.692010953 +0000 UTC m=+0.064978357 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.312 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.312 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.388 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.691 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.692 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.699 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.700 186792 INFO nova.compute.claims [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:41:05 np0005531888 nova_compute[186788]: 2025-11-22 07:41:05.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.084 186792 DEBUG nova.compute.provider_tree [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.121 186792 DEBUG nova.scheduler.client.report [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.176 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.177 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.393 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.394 186792 DEBUG nova.network.neutron [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.471 186792 INFO nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.586 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.943 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.947 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.948 186792 INFO nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Creating image(s)#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.949 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.950 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.950 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:06 np0005531888 nova_compute[186788]: 2025-11-22 07:41:06.963 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.031 186792 DEBUG nova.policy [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0eeafa43d6c84f6888a05c3f4ca3fb78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.035 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.035 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.036 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.048 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.113 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.114 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.237 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk 1073741824" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.238 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.239 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.298 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.300 186792 DEBUG nova.virt.disk.api [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Checking if we can resize image /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.300 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.361 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.362 186792 DEBUG nova.virt.disk.api [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Cannot resize image /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.363 186792 DEBUG nova.objects.instance [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lazy-loading 'migration_context' on Instance uuid ee885dd4-c723-4bb6-a0f3-87effcedb330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.423 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.424 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.424 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.425 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.425 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.426 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.451 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.452 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.491 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.493 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.507 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.575 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.576 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.577 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.589 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.650 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.652 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.724 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0 1073741824" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.725 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.725 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.790 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.791 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.792 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Ensure instance console log exists: /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.793 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.793 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:07 np0005531888 nova_compute[186788]: 2025-11-22 07:41:07.793 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:08 np0005531888 nova_compute[186788]: 2025-11-22 07:41:08.732 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797253.7308218, 8f7e8343-7bd9-402c-bea5-4a3202b54681 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:41:08 np0005531888 nova_compute[186788]: 2025-11-22 07:41:08.734 186792 INFO nova.compute.manager [-] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:41:08 np0005531888 nova_compute[186788]: 2025-11-22 07:41:08.766 186792 DEBUG nova.compute.manager [None req-53de97e5-7fd8-4c03-b8a8-e45833dfeba6 - - - - - -] [instance: 8f7e8343-7bd9-402c-bea5-4a3202b54681] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:41:08 np0005531888 nova_compute[186788]: 2025-11-22 07:41:08.789 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:08 np0005531888 nova_compute[186788]: 2025-11-22 07:41:08.982 186792 DEBUG nova.network.neutron [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Successfully created port: 6d7f4376-d18e-4738-9936-779dc0fdfefb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:41:10 np0005531888 nova_compute[186788]: 2025-11-22 07:41:10.818 186792 DEBUG nova.network.neutron [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Successfully updated port: 6d7f4376-d18e-4738-9936-779dc0fdfefb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:41:10 np0005531888 nova_compute[186788]: 2025-11-22 07:41:10.859 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:41:10 np0005531888 nova_compute[186788]: 2025-11-22 07:41:10.859 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquired lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:41:10 np0005531888 nova_compute[186788]: 2025-11-22 07:41:10.859 186792 DEBUG nova.network.neutron [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:41:10 np0005531888 nova_compute[186788]: 2025-11-22 07:41:10.887 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:11 np0005531888 nova_compute[186788]: 2025-11-22 07:41:11.167 186792 DEBUG nova.compute.manager [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-changed-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:11 np0005531888 nova_compute[186788]: 2025-11-22 07:41:11.168 186792 DEBUG nova.compute.manager [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Refreshing instance network info cache due to event network-changed-6d7f4376-d18e-4738-9936-779dc0fdfefb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:41:11 np0005531888 nova_compute[186788]: 2025-11-22 07:41:11.168 186792 DEBUG oslo_concurrency.lockutils [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:41:13 np0005531888 nova_compute[186788]: 2025-11-22 07:41:13.052 186792 DEBUG nova.network.neutron [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:41:13 np0005531888 nova_compute[186788]: 2025-11-22 07:41:13.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.101 186792 DEBUG nova.network.neutron [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updating instance_info_cache with network_info: [{"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.152 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Releasing lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.153 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Instance network_info: |[{"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.153 186792 DEBUG oslo_concurrency.lockutils [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.153 186792 DEBUG nova.network.neutron [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Refreshing network info cache for port 6d7f4376-d18e-4738-9936-779dc0fdfefb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.157 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Start _get_guest_xml network_info=[{"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [{'disk_bus': 'virtio', 'device_name': '/dev/vdb', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 1, 'encrypted': False}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.161 186792 WARNING nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.172 186792 DEBUG nova.virt.libvirt.host [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.172 186792 DEBUG nova.virt.libvirt.host [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.178 186792 DEBUG nova.virt.libvirt.host [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.179 186792 DEBUG nova.virt.libvirt.host [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.180 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.180 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:39:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='962835561',id=6,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-2110312460',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.180 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.180 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.181 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.181 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.181 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.181 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.181 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.182 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.182 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.182 186792 DEBUG nova.virt.hardware [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.185 186792 DEBUG nova.virt.libvirt.vif [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:41:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-326802289',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-326802289',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(6),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-326802289',id=10,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNvo81uLNODY5pXMCXv/rgxcCiuBWxjDFSMOswBarzwWE4bZrCdQaaMgGCGacDcycmYMfjuNyIpB44+zMTJDP3JvkVGjJV4StWUn/AhoiRpx02XDT0ns/iRT7Ya1fxBPw==',key_name='tempest-keypair-346383250',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af1e32bc189c402bad715e6c4cc8dcfa',ramdisk_id='',reservation_id='r-30knckko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:41:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0eeafa43d6c84f6888a05c3f4ca3fb78',uuid=ee885dd4-c723-4bb6-a0f3-87effcedb330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.185 186792 DEBUG nova.network.os_vif_util [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converting VIF {"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.186 186792 DEBUG nova.network.os_vif_util [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.187 186792 DEBUG nova.objects.instance [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lazy-loading 'pci_devices' on Instance uuid ee885dd4-c723-4bb6-a0f3-87effcedb330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.207 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <uuid>ee885dd4-c723-4bb6-a0f3-87effcedb330</uuid>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <name>instance-0000000a</name>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-326802289</nova:name>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:41:15</nova:creationTime>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:flavor name="tempest-flavor_with_ephemeral_1-2110312460">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:ephemeral>1</nova:ephemeral>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:user uuid="0eeafa43d6c84f6888a05c3f4ca3fb78">tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member</nova:user>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:project uuid="af1e32bc189c402bad715e6c4cc8dcfa">tempest-ServersWithSpecificFlavorTestJSON-1826293598</nova:project>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        <nova:port uuid="6d7f4376-d18e-4738-9936-779dc0fdfefb">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <entry name="serial">ee885dd4-c723-4bb6-a0f3-87effcedb330</entry>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <entry name="uuid">ee885dd4-c723-4bb6-a0f3-87effcedb330</entry>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <target dev="vdb" bus="virtio"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.config"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:15:2f:9e"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <target dev="tap6d7f4376-d1"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/console.log" append="off"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:41:15 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:41:15 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:41:15 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:41:15 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.208 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Preparing to wait for external event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.208 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.209 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.209 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.210 186792 DEBUG nova.virt.libvirt.vif [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:41:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-326802289',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-326802289',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(6),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-326802289',id=10,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNvo81uLNODY5pXMCXv/rgxcCiuBWxjDFSMOswBarzwWE4bZrCdQaaMgGCGacDcycmYMfjuNyIpB44+zMTJDP3JvkVGjJV4StWUn/AhoiRpx02XDT0ns/iRT7Ya1fxBPw==',key_name='tempest-keypair-346383250',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af1e32bc189c402bad715e6c4cc8dcfa',ramdisk_id='',reservation_id='r-30knckko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:41:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0eeafa43d6c84f6888a05c3f4ca3fb78',uuid=ee885dd4-c723-4bb6-a0f3-87effcedb330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.210 186792 DEBUG nova.network.os_vif_util [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converting VIF {"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.211 186792 DEBUG nova.network.os_vif_util [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.211 186792 DEBUG os_vif [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.212 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.213 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.213 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.216 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.216 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d7f4376-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.216 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6d7f4376-d1, col_values=(('external_ids', {'iface-id': '6d7f4376-d18e-4738-9936-779dc0fdfefb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:2f:9e', 'vm-uuid': 'ee885dd4-c723-4bb6-a0f3-87effcedb330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.251 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:15 np0005531888 NetworkManager[55166]: <info>  [1763797275.2527] manager: (tap6d7f4376-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.254 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.259 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.260 186792 INFO os_vif [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1')#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.312 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.312 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.313 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.313 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] No VIF found with MAC fa:16:3e:15:2f:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.313 186792 INFO nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Using config drive#033[00m
Nov 22 02:41:15 np0005531888 nova_compute[186788]: 2025-11-22 07:41:15.889 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:16 np0005531888 podman[214332]: 2025-11-22 07:41:16.698465414 +0000 UTC m=+0.066265517 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:41:16 np0005531888 podman[214333]: 2025-11-22 07:41:16.721814221 +0000 UTC m=+0.086377316 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:41:16 np0005531888 nova_compute[186788]: 2025-11-22 07:41:16.874 186792 INFO nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Creating config drive at /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.config#033[00m
Nov 22 02:41:16 np0005531888 nova_compute[186788]: 2025-11-22 07:41:16.881 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp02hhsjte execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.006 186792 DEBUG oslo_concurrency.processutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp02hhsjte" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:17 np0005531888 kernel: tap6d7f4376-d1: entered promiscuous mode
Nov 22 02:41:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:17Z|00044|binding|INFO|Claiming lport 6d7f4376-d18e-4738-9936-779dc0fdfefb for this chassis.
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.090 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:17 np0005531888 NetworkManager[55166]: <info>  [1763797277.0914] manager: (tap6d7f4376-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Nov 22 02:41:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:17Z|00045|binding|INFO|6d7f4376-d18e-4738-9936-779dc0fdfefb: Claiming fa:16:3e:15:2f:9e 10.100.0.5
Nov 22 02:41:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:17Z|00046|binding|INFO|Setting lport 6d7f4376-d18e-4738-9936-779dc0fdfefb ovn-installed in OVS
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.103 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:17 np0005531888 systemd-udevd[214392]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:41:17 np0005531888 NetworkManager[55166]: <info>  [1763797277.1339] device (tap6d7f4376-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:41:17 np0005531888 systemd-machined[153106]: New machine qemu-4-instance-0000000a.
Nov 22 02:41:17 np0005531888 NetworkManager[55166]: <info>  [1763797277.1364] device (tap6d7f4376-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:41:17 np0005531888 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.550 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797277.5497212, ee885dd4-c723-4bb6-a0f3-87effcedb330 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.552 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] VM Started (Lifecycle Event)#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.577 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.581 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797277.5498548, ee885dd4-c723-4bb6-a0f3-87effcedb330 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.581 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.604 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.608 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:41:17 np0005531888 nova_compute[186788]: 2025-11-22 07:41:17.624 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:41:18 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:18Z|00047|binding|INFO|Setting lport 6d7f4376-d18e-4738-9936-779dc0fdfefb up in Southbound
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.023 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:2f:9e 10.100.0.5'], port_security=['fa:16:3e:15:2f:9e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ee885dd4-c723-4bb6-a0f3-87effcedb330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b53dee3e-cc57-4959-b703-fd736782ce77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08d7499b-95e9-4cf7-b602-701ee3e333bf, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6d7f4376-d18e-4738-9936-779dc0fdfefb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.024 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6d7f4376-d18e-4738-9936-779dc0fdfefb in datapath 1ae6b2a9-f586-4520-bc3d-923fe57139cb bound to our chassis#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.026 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1ae6b2a9-f586-4520-bc3d-923fe57139cb#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.041 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[69de2c4c-8f94-47d0-9e80-acf834bd3ccc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.042 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1ae6b2a9-f1 in ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.043 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1ae6b2a9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.044 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3a021286-7616-4f6e-a8af-f4885a2a8789]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.044 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a8decef3-c4fd-414c-af63-83add0c532ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.058 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[d3cbee5d-2817-42e4-b368-95f545c64e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.076 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[593e224d-a849-451b-a24a-aff541fae090]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.109 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[42fa5f3c-ddd9-4c3f-be1e-a2c2c80d90fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.115 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[119c7398-24d5-41d8-baa8-eb26ee5b54cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 NetworkManager[55166]: <info>  [1763797278.1171] manager: (tap1ae6b2a9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Nov 22 02:41:18 np0005531888 systemd-udevd[214396]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.150 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d86b5a-1e83-4df7-9918-bdb6e930da92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.154 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[52bf9863-4ef7-438d-ace5-903745f18cff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 NetworkManager[55166]: <info>  [1763797278.1821] device (tap1ae6b2a9-f0): carrier: link connected
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.189 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a16ad6-98ab-42df-97d4-82a41213eef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.206 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[97c5088e-e0a0-492b-8cdb-5fddf14727ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1ae6b2a9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:30:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401882, 'reachable_time': 15773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214435, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.222 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3461c1d3-be1d-429c-aeb6-71fac2b608f4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:30cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401882, 'tstamp': 401882}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214436, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.241 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d13df113-25b4-45b1-8ee9-92328e6b35fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1ae6b2a9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:30:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401882, 'reachable_time': 15773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214437, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.270 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[964421f2-7ea9-462f-afff-1f01596bebfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.334 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[84fbcbce-4056-430f-8adb-c847ff339cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.337 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ae6b2a9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.338 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.338 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ae6b2a9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:18 np0005531888 kernel: tap1ae6b2a9-f0: entered promiscuous mode
Nov 22 02:41:18 np0005531888 nova_compute[186788]: 2025-11-22 07:41:18.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:18 np0005531888 NetworkManager[55166]: <info>  [1763797278.3424] manager: (tap1ae6b2a9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 22 02:41:18 np0005531888 nova_compute[186788]: 2025-11-22 07:41:18.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.344 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1ae6b2a9-f0, col_values=(('external_ids', {'iface-id': '254524bd-994c-43d1-84a4-6c0edeae0f13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:18 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:18Z|00048|binding|INFO|Releasing lport 254524bd-994c-43d1-84a4-6c0edeae0f13 from this chassis (sb_readonly=0)
Nov 22 02:41:18 np0005531888 nova_compute[186788]: 2025-11-22 07:41:18.346 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.347 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1ae6b2a9-f586-4520-bc3d-923fe57139cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1ae6b2a9-f586-4520-bc3d-923fe57139cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.348 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1f79f0c1-1a16-4325-af83-823e4cb0d89f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.349 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-1ae6b2a9-f586-4520-bc3d-923fe57139cb
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/1ae6b2a9-f586-4520-bc3d-923fe57139cb.pid.haproxy
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 1ae6b2a9-f586-4520-bc3d-923fe57139cb
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:41:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:18.350 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'env', 'PROCESS_TAG=haproxy-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1ae6b2a9-f586-4520-bc3d-923fe57139cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:41:18 np0005531888 nova_compute[186788]: 2025-11-22 07:41:18.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:18 np0005531888 podman[214470]: 2025-11-22 07:41:18.755115795 +0000 UTC m=+0.060651332 container create 35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 02:41:18 np0005531888 systemd[1]: Started libpod-conmon-35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d.scope.
Nov 22 02:41:18 np0005531888 podman[214470]: 2025-11-22 07:41:18.718709252 +0000 UTC m=+0.024244809 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:41:18 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:41:18 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dae14049ec5e6c2a0ac7529a4d5266fb46836ab596c929e85cdad982089f276/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:41:18 np0005531888 podman[214470]: 2025-11-22 07:41:18.844495082 +0000 UTC m=+0.150030639 container init 35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:41:18 np0005531888 podman[214470]: 2025-11-22 07:41:18.850102548 +0000 UTC m=+0.155638085 container start 35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:41:18 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [NOTICE]   (214489) : New worker (214491) forked
Nov 22 02:41:18 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [NOTICE]   (214489) : Loading success.
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.795 186792 DEBUG nova.compute.manager [req-c5866008-5df7-4906-ad9e-8d95f6d49e5b req-c0f41d06-7882-4d11-a542-ed4a6f7efb5b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.796 186792 DEBUG oslo_concurrency.lockutils [req-c5866008-5df7-4906-ad9e-8d95f6d49e5b req-c0f41d06-7882-4d11-a542-ed4a6f7efb5b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.796 186792 DEBUG oslo_concurrency.lockutils [req-c5866008-5df7-4906-ad9e-8d95f6d49e5b req-c0f41d06-7882-4d11-a542-ed4a6f7efb5b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.796 186792 DEBUG oslo_concurrency.lockutils [req-c5866008-5df7-4906-ad9e-8d95f6d49e5b req-c0f41d06-7882-4d11-a542-ed4a6f7efb5b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.797 186792 DEBUG nova.compute.manager [req-c5866008-5df7-4906-ad9e-8d95f6d49e5b req-c0f41d06-7882-4d11-a542-ed4a6f7efb5b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Processing event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.797 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.802 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797279.8017254, ee885dd4-c723-4bb6-a0f3-87effcedb330 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.802 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.805 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.809 186792 INFO nova.virt.libvirt.driver [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Instance spawned successfully.#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.809 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.827 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.833 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.837 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.837 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.838 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.838 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.838 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.839 186792 DEBUG nova.virt.libvirt.driver [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.867 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.978 186792 INFO nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Took 13.03 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:41:19 np0005531888 nova_compute[186788]: 2025-11-22 07:41:19.979 186792 DEBUG nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.013 186792 DEBUG nova.network.neutron [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updated VIF entry in instance network info cache for port 6d7f4376-d18e-4738-9936-779dc0fdfefb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.013 186792 DEBUG nova.network.neutron [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updating instance_info_cache with network_info: [{"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.071 186792 DEBUG oslo_concurrency.lockutils [req-cd1578d7-4b5d-4364-9fa4-0265b4778de0 req-753e9aa2-4144-47f3-b3d2-542ce541acf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.105 186792 INFO nova.compute.manager [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Took 14.60 seconds to build instance.#033[00m
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.178 186792 DEBUG oslo_concurrency.lockutils [None req-42ae3046-4c6d-4947-85b5-11559345a772 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.865s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.284 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:20 np0005531888 podman[214500]: 2025-11-22 07:41:20.683486526 +0000 UTC m=+0.052083024 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:41:20 np0005531888 nova_compute[186788]: 2025-11-22 07:41:20.894 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:21 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:21Z|00049|binding|INFO|Releasing lport 254524bd-994c-43d1-84a4-6c0edeae0f13 from this chassis (sb_readonly=0)
Nov 22 02:41:21 np0005531888 nova_compute[186788]: 2025-11-22 07:41:21.274 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:21 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:21Z|00050|binding|INFO|Releasing lport 254524bd-994c-43d1-84a4-6c0edeae0f13 from this chassis (sb_readonly=0)
Nov 22 02:41:21 np0005531888 nova_compute[186788]: 2025-11-22 07:41:21.433 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:22 np0005531888 nova_compute[186788]: 2025-11-22 07:41:22.009 186792 DEBUG nova.compute.manager [req-1f5adec4-acff-42ee-94d4-ca7f361b7299 req-c59dcc1d-7e72-4bf6-9615-b2371f3031fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:22 np0005531888 nova_compute[186788]: 2025-11-22 07:41:22.010 186792 DEBUG oslo_concurrency.lockutils [req-1f5adec4-acff-42ee-94d4-ca7f361b7299 req-c59dcc1d-7e72-4bf6-9615-b2371f3031fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:22 np0005531888 nova_compute[186788]: 2025-11-22 07:41:22.011 186792 DEBUG oslo_concurrency.lockutils [req-1f5adec4-acff-42ee-94d4-ca7f361b7299 req-c59dcc1d-7e72-4bf6-9615-b2371f3031fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:22 np0005531888 nova_compute[186788]: 2025-11-22 07:41:22.011 186792 DEBUG oslo_concurrency.lockutils [req-1f5adec4-acff-42ee-94d4-ca7f361b7299 req-c59dcc1d-7e72-4bf6-9615-b2371f3031fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:22 np0005531888 nova_compute[186788]: 2025-11-22 07:41:22.012 186792 DEBUG nova.compute.manager [req-1f5adec4-acff-42ee-94d4-ca7f361b7299 req-c59dcc1d-7e72-4bf6-9615-b2371f3031fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] No waiting events found dispatching network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:41:22 np0005531888 nova_compute[186788]: 2025-11-22 07:41:22.012 186792 WARNING nova.compute.manager [req-1f5adec4-acff-42ee-94d4-ca7f361b7299 req-c59dcc1d-7e72-4bf6-9615-b2371f3031fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received unexpected event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb for instance with vm_state active and task_state None.#033[00m
Nov 22 02:41:24 np0005531888 nova_compute[186788]: 2025-11-22 07:41:24.187 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:24 np0005531888 NetworkManager[55166]: <info>  [1763797284.1891] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 22 02:41:24 np0005531888 NetworkManager[55166]: <info>  [1763797284.1910] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 22 02:41:24 np0005531888 nova_compute[186788]: 2025-11-22 07:41:24.255 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:24 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:24Z|00051|binding|INFO|Releasing lport 254524bd-994c-43d1-84a4-6c0edeae0f13 from this chassis (sb_readonly=0)
Nov 22 02:41:24 np0005531888 nova_compute[186788]: 2025-11-22 07:41:24.265 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:24 np0005531888 nova_compute[186788]: 2025-11-22 07:41:24.672 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:24.671 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:41:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:24.673 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:41:24 np0005531888 podman[214526]: 2025-11-22 07:41:24.705686757 +0000 UTC m=+0.071558460 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.011 186792 DEBUG nova.compute.manager [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-changed-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.012 186792 DEBUG nova.compute.manager [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Refreshing instance network info cache due to event network-changed-6d7f4376-d18e-4738-9936-779dc0fdfefb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.012 186792 DEBUG oslo_concurrency.lockutils [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.012 186792 DEBUG oslo_concurrency.lockutils [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.013 186792 DEBUG nova.network.neutron [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Refreshing network info cache for port 6d7f4376-d18e-4738-9936-779dc0fdfefb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.287 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:25 np0005531888 nova_compute[186788]: 2025-11-22 07:41:25.897 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:27.677 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:30 np0005531888 nova_compute[186788]: 2025-11-22 07:41:30.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:30 np0005531888 podman[214546]: 2025-11-22 07:41:30.701757867 +0000 UTC m=+0.070683828 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:41:30 np0005531888 nova_compute[186788]: 2025-11-22 07:41:30.899 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:31 np0005531888 nova_compute[186788]: 2025-11-22 07:41:31.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:31 np0005531888 nova_compute[186788]: 2025-11-22 07:41:31.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:41:31 np0005531888 nova_compute[186788]: 2025-11-22 07:41:31.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:41:32 np0005531888 nova_compute[186788]: 2025-11-22 07:41:32.598 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:41:32 np0005531888 podman[214566]: 2025-11-22 07:41:32.68135334 +0000 UTC m=+0.053704694 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:41:33 np0005531888 nova_compute[186788]: 2025-11-22 07:41:33.773 186792 DEBUG nova.network.neutron [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updated VIF entry in instance network info cache for port 6d7f4376-d18e-4738-9936-779dc0fdfefb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:41:33 np0005531888 nova_compute[186788]: 2025-11-22 07:41:33.775 186792 DEBUG nova.network.neutron [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updating instance_info_cache with network_info: [{"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:41:33 np0005531888 nova_compute[186788]: 2025-11-22 07:41:33.836 186792 DEBUG oslo_concurrency.lockutils [req-f93acc6f-7950-4b45-be7e-fdf993bc0548 req-fce0144a-407e-49d1-9164-c5c37a58575d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:41:33 np0005531888 nova_compute[186788]: 2025-11-22 07:41:33.837 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:41:33 np0005531888 nova_compute[186788]: 2025-11-22 07:41:33.837 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:41:33 np0005531888 nova_compute[186788]: 2025-11-22 07:41:33.838 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ee885dd4-c723-4bb6-a0f3-87effcedb330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:41:34 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:34Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:2f:9e 10.100.0.5
Nov 22 02:41:34 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:34Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:2f:9e 10.100.0.5
Nov 22 02:41:35 np0005531888 nova_compute[186788]: 2025-11-22 07:41:35.330 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:35 np0005531888 podman[214609]: 2025-11-22 07:41:35.724498824 +0000 UTC m=+0.094210548 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 22 02:41:35 np0005531888 nova_compute[186788]: 2025-11-22 07:41:35.901 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:36.794 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:36.794 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:36.795 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.045 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updating instance_info_cache with network_info: [{"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.076 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-ee885dd4-c723-4bb6-a0f3-87effcedb330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.077 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.077 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.078 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.078 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.078 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.078 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.078 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.079 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.109 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.109 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.109 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.109 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.200 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.268 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.270 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.347 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.348 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.409 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.410 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.469 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330/disk.eph0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.637 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.638 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5607MB free_disk=73.4292221069336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.638 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.638 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.721 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance ee885dd4-c723-4bb6-a0f3-87effcedb330 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.721 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.722 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.811 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.829 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.871 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:41:37 np0005531888 nova_compute[186788]: 2025-11-22 07:41:37.872 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:38 np0005531888 nova_compute[186788]: 2025-11-22 07:41:38.747 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:38 np0005531888 nova_compute[186788]: 2025-11-22 07:41:38.748 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:38 np0005531888 nova_compute[186788]: 2025-11-22 07:41:38.777 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:40 np0005531888 nova_compute[186788]: 2025-11-22 07:41:40.366 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:40 np0005531888 nova_compute[186788]: 2025-11-22 07:41:40.903 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.112 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.112 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.113 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.113 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.114 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.123 186792 INFO nova.compute.manager [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Terminating instance#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.130 186792 DEBUG nova.compute.manager [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:41:45 np0005531888 kernel: tap6d7f4376-d1 (unregistering): left promiscuous mode
Nov 22 02:41:45 np0005531888 NetworkManager[55166]: <info>  [1763797305.1817] device (tap6d7f4376-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.187 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:45Z|00052|binding|INFO|Releasing lport 6d7f4376-d18e-4738-9936-779dc0fdfefb from this chassis (sb_readonly=0)
Nov 22 02:41:45 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:45Z|00053|binding|INFO|Setting lport 6d7f4376-d18e-4738-9936-779dc0fdfefb down in Southbound
Nov 22 02:41:45 np0005531888 ovn_controller[95067]: 2025-11-22T07:41:45Z|00054|binding|INFO|Removing iface tap6d7f4376-d1 ovn-installed in OVS
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.190 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.198 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:2f:9e 10.100.0.5'], port_security=['fa:16:3e:15:2f:9e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ee885dd4-c723-4bb6-a0f3-87effcedb330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af1e32bc189c402bad715e6c4cc8dcfa', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b53dee3e-cc57-4959-b703-fd736782ce77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08d7499b-95e9-4cf7-b602-701ee3e333bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6d7f4376-d18e-4738-9936-779dc0fdfefb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.200 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6d7f4376-d18e-4738-9936-779dc0fdfefb in datapath 1ae6b2a9-f586-4520-bc3d-923fe57139cb unbound from our chassis#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.201 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1ae6b2a9-f586-4520-bc3d-923fe57139cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.202 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[62042edd-924d-4707-8c46-24d371c17867]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.203 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb namespace which is not needed anymore#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.207 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 22 02:41:45 np0005531888 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 14.999s CPU time.
Nov 22 02:41:45 np0005531888 systemd-machined[153106]: Machine qemu-4-instance-0000000a terminated.
Nov 22 02:41:45 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [NOTICE]   (214489) : haproxy version is 2.8.14-c23fe91
Nov 22 02:41:45 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [NOTICE]   (214489) : path to executable is /usr/sbin/haproxy
Nov 22 02:41:45 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [WARNING]  (214489) : Exiting Master process...
Nov 22 02:41:45 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [WARNING]  (214489) : Exiting Master process...
Nov 22 02:41:45 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [ALERT]    (214489) : Current worker (214491) exited with code 143 (Terminated)
Nov 22 02:41:45 np0005531888 neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb[214485]: [WARNING]  (214489) : All workers exited. Exiting... (0)
Nov 22 02:41:45 np0005531888 systemd[1]: libpod-35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d.scope: Deactivated successfully.
Nov 22 02:41:45 np0005531888 podman[214669]: 2025-11-22 07:41:45.355444578 +0000 UTC m=+0.047838637 container died 35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.368 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d-userdata-shm.mount: Deactivated successfully.
Nov 22 02:41:45 np0005531888 systemd[1]: var-lib-containers-storage-overlay-6dae14049ec5e6c2a0ac7529a4d5266fb46836ab596c929e85cdad982089f276-merged.mount: Deactivated successfully.
Nov 22 02:41:45 np0005531888 podman[214669]: 2025-11-22 07:41:45.405103361 +0000 UTC m=+0.097497410 container cleanup 35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:41:45 np0005531888 systemd[1]: libpod-conmon-35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d.scope: Deactivated successfully.
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.417 186792 INFO nova.virt.libvirt.driver [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Instance destroyed successfully.#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.417 186792 DEBUG nova.objects.instance [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lazy-loading 'resources' on Instance uuid ee885dd4-c723-4bb6-a0f3-87effcedb330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.438 186792 DEBUG nova.virt.libvirt.vif [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:41:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-326802289',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-326802289',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(6),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-326802289',id=10,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCNvo81uLNODY5pXMCXv/rgxcCiuBWxjDFSMOswBarzwWE4bZrCdQaaMgGCGacDcycmYMfjuNyIpB44+zMTJDP3JvkVGjJV4StWUn/AhoiRpx02XDT0ns/iRT7Ya1fxBPw==',key_name='tempest-keypair-346383250',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:41:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af1e32bc189c402bad715e6c4cc8dcfa',ramdisk_id='',reservation_id='r-30knckko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1826293598-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:41:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0eeafa43d6c84f6888a05c3f4ca3fb78',uuid=ee885dd4-c723-4bb6-a0f3-87effcedb330,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.439 186792 DEBUG nova.network.os_vif_util [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converting VIF {"id": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "address": "fa:16:3e:15:2f:9e", "network": {"id": "1ae6b2a9-f586-4520-bc3d-923fe57139cb", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1622257356-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af1e32bc189c402bad715e6c4cc8dcfa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d7f4376-d1", "ovs_interfaceid": "6d7f4376-d18e-4738-9936-779dc0fdfefb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.440 186792 DEBUG nova.network.os_vif_util [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.440 186792 DEBUG os_vif [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.443 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d7f4376-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.445 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.449 186792 INFO os_vif [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:2f:9e,bridge_name='br-int',has_traffic_filtering=True,id=6d7f4376-d18e-4738-9936-779dc0fdfefb,network=Network(1ae6b2a9-f586-4520-bc3d-923fe57139cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d7f4376-d1')#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.450 186792 INFO nova.virt.libvirt.driver [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Deleting instance files /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330_del#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.451 186792 INFO nova.virt.libvirt.driver [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Deletion of /var/lib/nova/instances/ee885dd4-c723-4bb6-a0f3-87effcedb330_del complete#033[00m
Nov 22 02:41:45 np0005531888 podman[214722]: 2025-11-22 07:41:45.472099726 +0000 UTC m=+0.041943040 container remove 35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.477 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4aeed51b-3f79-4b1a-98a4-d93a44628e60]: (4, ('Sat Nov 22 07:41:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb (35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d)\n35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d\nSat Nov 22 07:41:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb (35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d)\n35d6008493b780a0ffcfc5ca38136264e9662d669121825ebb96678f9c79ef6d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.478 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5a067887-6d1a-4c7e-899a-9912321585ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.479 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ae6b2a9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.481 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 kernel: tap1ae6b2a9-f0: left promiscuous mode
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.492 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a01690f8-6238-4db1-9125-9652e810ef51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.518 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[89974e30-0d33-48cb-9b9d-25ca775dcb18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.519 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[739504c3-a05a-4fe3-9127-34e1c3ccf358]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.533 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac761cd-9bb4-4721-b995-765c898f7c71]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401874, 'reachable_time': 23519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214737, 'error': None, 'target': 'ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.536 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1ae6b2a9-f586-4520-bc3d-923fe57139cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:41:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:41:45.536 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[2d917e56-8342-4c87-b786-1ec8fc2eccde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:45 np0005531888 systemd[1]: run-netns-ovnmeta\x2d1ae6b2a9\x2df586\x2d4520\x2dbc3d\x2d923fe57139cb.mount: Deactivated successfully.
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.582 186792 INFO nova.compute.manager [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.583 186792 DEBUG oslo.service.loopingcall [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.583 186792 DEBUG nova.compute.manager [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.584 186792 DEBUG nova.network.neutron [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.720 186792 DEBUG nova.compute.manager [req-c0a9f99c-cc74-4925-b89b-82399d3ddee3 req-52d79d35-14e7-45b6-b525-598077e19ec0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-vif-unplugged-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.720 186792 DEBUG oslo_concurrency.lockutils [req-c0a9f99c-cc74-4925-b89b-82399d3ddee3 req-52d79d35-14e7-45b6-b525-598077e19ec0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.720 186792 DEBUG oslo_concurrency.lockutils [req-c0a9f99c-cc74-4925-b89b-82399d3ddee3 req-52d79d35-14e7-45b6-b525-598077e19ec0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.721 186792 DEBUG oslo_concurrency.lockutils [req-c0a9f99c-cc74-4925-b89b-82399d3ddee3 req-52d79d35-14e7-45b6-b525-598077e19ec0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.721 186792 DEBUG nova.compute.manager [req-c0a9f99c-cc74-4925-b89b-82399d3ddee3 req-52d79d35-14e7-45b6-b525-598077e19ec0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] No waiting events found dispatching network-vif-unplugged-6d7f4376-d18e-4738-9936-779dc0fdfefb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.721 186792 DEBUG nova.compute.manager [req-c0a9f99c-cc74-4925-b89b-82399d3ddee3 req-52d79d35-14e7-45b6-b525-598077e19ec0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-vif-unplugged-6d7f4376-d18e-4738-9936-779dc0fdfefb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:41:45 np0005531888 nova_compute[186788]: 2025-11-22 07:41:45.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:47 np0005531888 podman[214738]: 2025-11-22 07:41:47.688622215 +0000 UTC m=+0.060933726 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 02:41:47 np0005531888 podman[214739]: 2025-11-22 07:41:47.718946132 +0000 UTC m=+0.088010722 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.849 186792 DEBUG nova.network.neutron [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.870 186792 DEBUG nova.compute.manager [req-7ac9f288-afee-49c7-a79a-901a96a72088 req-5bf7a193-1e2c-4554-9cc8-477857896bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.870 186792 DEBUG oslo_concurrency.lockutils [req-7ac9f288-afee-49c7-a79a-901a96a72088 req-5bf7a193-1e2c-4554-9cc8-477857896bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.870 186792 DEBUG oslo_concurrency.lockutils [req-7ac9f288-afee-49c7-a79a-901a96a72088 req-5bf7a193-1e2c-4554-9cc8-477857896bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.871 186792 DEBUG oslo_concurrency.lockutils [req-7ac9f288-afee-49c7-a79a-901a96a72088 req-5bf7a193-1e2c-4554-9cc8-477857896bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.871 186792 DEBUG nova.compute.manager [req-7ac9f288-afee-49c7-a79a-901a96a72088 req-5bf7a193-1e2c-4554-9cc8-477857896bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] No waiting events found dispatching network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.871 186792 WARNING nova.compute.manager [req-7ac9f288-afee-49c7-a79a-901a96a72088 req-5bf7a193-1e2c-4554-9cc8-477857896bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received unexpected event network-vif-plugged-6d7f4376-d18e-4738-9936-779dc0fdfefb for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:41:47 np0005531888 nova_compute[186788]: 2025-11-22 07:41:47.880 186792 INFO nova.compute.manager [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Took 2.30 seconds to deallocate network for instance.#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.009 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.009 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.038 186792 DEBUG nova.compute.manager [req-3e2fecc6-a8b6-49a7-af0d-6bd3300a4a3d req-ba470d00-a164-4757-9398-87405170676c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Received event network-vif-deleted-6d7f4376-d18e-4738-9936-779dc0fdfefb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.101 186792 DEBUG nova.compute.provider_tree [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.122 186792 DEBUG nova.scheduler.client.report [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.175 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.252 186792 INFO nova.scheduler.client.report [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Deleted allocations for instance ee885dd4-c723-4bb6-a0f3-87effcedb330#033[00m
Nov 22 02:41:48 np0005531888 nova_compute[186788]: 2025-11-22 07:41:48.399 186792 DEBUG oslo_concurrency.lockutils [None req-b457d937-13e9-4b67-bf25-a4a80358baab 0eeafa43d6c84f6888a05c3f4ca3fb78 af1e32bc189c402bad715e6c4cc8dcfa - - default default] Lock "ee885dd4-c723-4bb6-a0f3-87effcedb330" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:50 np0005531888 nova_compute[186788]: 2025-11-22 07:41:50.448 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:50 np0005531888 nova_compute[186788]: 2025-11-22 07:41:50.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:51 np0005531888 podman[214780]: 2025-11-22 07:41:51.677475691 +0000 UTC m=+0.055307304 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:41:55 np0005531888 nova_compute[186788]: 2025-11-22 07:41:55.452 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:55 np0005531888 podman[214804]: 2025-11-22 07:41:55.689697253 +0000 UTC m=+0.057373817 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 02:41:55 np0005531888 nova_compute[186788]: 2025-11-22 07:41:55.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:00 np0005531888 nova_compute[186788]: 2025-11-22 07:42:00.416 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797305.4125352, ee885dd4-c723-4bb6-a0f3-87effcedb330 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:42:00 np0005531888 nova_compute[186788]: 2025-11-22 07:42:00.417 186792 INFO nova.compute.manager [-] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:42:00 np0005531888 nova_compute[186788]: 2025-11-22 07:42:00.456 186792 DEBUG nova.compute.manager [None req-cc2c3c19-70dd-4d0e-b40a-de55f0d421e8 - - - - - -] [instance: ee885dd4-c723-4bb6-a0f3-87effcedb330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:00 np0005531888 nova_compute[186788]: 2025-11-22 07:42:00.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:00 np0005531888 nova_compute[186788]: 2025-11-22 07:42:00.943 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:01 np0005531888 podman[214823]: 2025-11-22 07:42:01.692200273 +0000 UTC m=+0.061010238 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 02:42:03 np0005531888 podman[214843]: 2025-11-22 07:42:03.685377186 +0000 UTC m=+0.058093274 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:42:05 np0005531888 nova_compute[186788]: 2025-11-22 07:42:05.459 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:05 np0005531888 nova_compute[186788]: 2025-11-22 07:42:05.721 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:05 np0005531888 nova_compute[186788]: 2025-11-22 07:42:05.828 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:05 np0005531888 nova_compute[186788]: 2025-11-22 07:42:05.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:06 np0005531888 podman[214866]: 2025-11-22 07:42:06.695048362 +0000 UTC m=+0.067192532 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 22 02:42:10 np0005531888 nova_compute[186788]: 2025-11-22 07:42:10.491 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:10 np0005531888 nova_compute[186788]: 2025-11-22 07:42:10.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:15 np0005531888 nova_compute[186788]: 2025-11-22 07:42:15.539 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:15 np0005531888 nova_compute[186788]: 2025-11-22 07:42:15.949 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:18 np0005531888 podman[214887]: 2025-11-22 07:42:18.70586631 +0000 UTC m=+0.066568135 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 02:42:18 np0005531888 podman[214888]: 2025-11-22 07:42:18.755577214 +0000 UTC m=+0.121787307 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Nov 22 02:42:20 np0005531888 nova_compute[186788]: 2025-11-22 07:42:20.543 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:21 np0005531888 nova_compute[186788]: 2025-11-22 07:42:21.003 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:22 np0005531888 podman[214931]: 2025-11-22 07:42:22.675748303 +0000 UTC m=+0.053011718 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:42:25 np0005531888 nova_compute[186788]: 2025-11-22 07:42:25.547 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:26 np0005531888 nova_compute[186788]: 2025-11-22 07:42:26.005 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:26 np0005531888 podman[214955]: 2025-11-22 07:42:26.714550919 +0000 UTC m=+0.082695489 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 02:42:30 np0005531888 nova_compute[186788]: 2025-11-22 07:42:30.580 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:31 np0005531888 nova_compute[186788]: 2025-11-22 07:42:31.007 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:31 np0005531888 nova_compute[186788]: 2025-11-22 07:42:31.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:31 np0005531888 nova_compute[186788]: 2025-11-22 07:42:31.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:42:31 np0005531888 nova_compute[186788]: 2025-11-22 07:42:31.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:42:31 np0005531888 nova_compute[186788]: 2025-11-22 07:42:31.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:42:32 np0005531888 podman[214975]: 2025-11-22 07:42:32.688838204 +0000 UTC m=+0.059167161 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:42:32 np0005531888 nova_compute[186788]: 2025-11-22 07:42:32.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531888 nova_compute[186788]: 2025-11-22 07:42:33.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531888 nova_compute[186788]: 2025-11-22 07:42:33.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:42:34.024 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:42:34 np0005531888 nova_compute[186788]: 2025-11-22 07:42:34.025 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:42:34.026 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:42:34 np0005531888 podman[214995]: 2025-11-22 07:42:34.679529164 +0000 UTC m=+0.048160115 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:42:34 np0005531888 nova_compute[186788]: 2025-11-22 07:42:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:42:35.028 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:42:35 np0005531888 nova_compute[186788]: 2025-11-22 07:42:35.583 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:35 np0005531888 nova_compute[186788]: 2025-11-22 07:42:35.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:35 np0005531888 nova_compute[186788]: 2025-11-22 07:42:35.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:35 np0005531888 nova_compute[186788]: 2025-11-22 07:42:35.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:35 np0005531888 nova_compute[186788]: 2025-11-22 07:42:35.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:35 np0005531888 nova_compute[186788]: 2025-11-22 07:42:35.981 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.171 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.172 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5784MB free_disk=73.45904922485352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.172 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.173 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.484 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.484 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.605 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.628 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.710 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:42:36 np0005531888 nova_compute[186788]: 2025-11-22 07:42:36.711 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:42:36.794 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:42:36.795 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:42:36.795 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.834 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.835 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.836 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:42:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:37 np0005531888 podman[215022]: 2025-11-22 07:42:37.688840021 +0000 UTC m=+0.059691854 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 02:42:37 np0005531888 nova_compute[186788]: 2025-11-22 07:42:37.712 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:37 np0005531888 nova_compute[186788]: 2025-11-22 07:42:37.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:37 np0005531888 nova_compute[186788]: 2025-11-22 07:42:37.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:42:39 np0005531888 nova_compute[186788]: 2025-11-22 07:42:39.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:40 np0005531888 nova_compute[186788]: 2025-11-22 07:42:40.587 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:41 np0005531888 nova_compute[186788]: 2025-11-22 07:42:41.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:45 np0005531888 nova_compute[186788]: 2025-11-22 07:42:45.592 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:46 np0005531888 nova_compute[186788]: 2025-11-22 07:42:46.013 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:49 np0005531888 podman[215043]: 2025-11-22 07:42:49.685940476 +0000 UTC m=+0.056261108 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:42:49 np0005531888 podman[215044]: 2025-11-22 07:42:49.732527191 +0000 UTC m=+0.097690605 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:42:50 np0005531888 nova_compute[186788]: 2025-11-22 07:42:50.635 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:51 np0005531888 nova_compute[186788]: 2025-11-22 07:42:51.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:54 np0005531888 podman[215089]: 2025-11-22 07:42:54.050790097 +0000 UTC m=+0.080145146 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:42:55 np0005531888 nova_compute[186788]: 2025-11-22 07:42:55.637 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:56 np0005531888 nova_compute[186788]: 2025-11-22 07:42:56.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:57 np0005531888 podman[215113]: 2025-11-22 07:42:57.68567182 +0000 UTC m=+0.054880984 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 02:43:00 np0005531888 nova_compute[186788]: 2025-11-22 07:43:00.642 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:01 np0005531888 nova_compute[186788]: 2025-11-22 07:43:01.020 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:03 np0005531888 podman[215132]: 2025-11-22 07:43:03.685933846 +0000 UTC m=+0.062079964 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:43:05 np0005531888 nova_compute[186788]: 2025-11-22 07:43:05.645 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:05 np0005531888 podman[215152]: 2025-11-22 07:43:05.676635777 +0000 UTC m=+0.050092354 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:43:06 np0005531888 nova_compute[186788]: 2025-11-22 07:43:06.023 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:08 np0005531888 podman[215176]: 2025-11-22 07:43:08.684972739 +0000 UTC m=+0.057059378 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, 
Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.362 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.362 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.437 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.607 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.608 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.617 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.618 186792 INFO nova.compute.claims [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.745 186792 DEBUG nova.compute.provider_tree [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.763 186792 DEBUG nova.scheduler.client.report [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.831 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.832 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.896 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.897 186792 DEBUG nova.network.neutron [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:43:09 np0005531888 nova_compute[186788]: 2025-11-22 07:43:09.990 186792 INFO nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.027 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.227 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.228 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.229 186792 INFO nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Creating image(s)#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.229 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "/var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.229 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.230 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.243 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.305 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.307 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.308 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.319 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.386 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.387 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.421 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.423 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.423 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.482 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.484 186792 DEBUG nova.virt.disk.api [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Checking if we can resize image /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.484 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.549 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.550 186792 DEBUG nova.virt.disk.api [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Cannot resize image /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.551 186792 DEBUG nova.objects.instance [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 99dcf8de-456d-4737-8362-0ddfc942c00a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.569 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.570 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Ensure instance console log exists: /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.571 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.571 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.571 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:10 np0005531888 nova_compute[186788]: 2025-11-22 07:43:10.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:11 np0005531888 nova_compute[186788]: 2025-11-22 07:43:11.078 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:11 np0005531888 nova_compute[186788]: 2025-11-22 07:43:11.118 186792 DEBUG nova.policy [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:43:13 np0005531888 nova_compute[186788]: 2025-11-22 07:43:13.076 186792 DEBUG nova.network.neutron [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Successfully created port: 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:43:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:13Z|00055|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 02:43:14 np0005531888 nova_compute[186788]: 2025-11-22 07:43:14.896 186792 DEBUG nova.network.neutron [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Successfully updated port: 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:43:14 np0005531888 nova_compute[186788]: 2025-11-22 07:43:14.927 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:14 np0005531888 nova_compute[186788]: 2025-11-22 07:43:14.928 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquired lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:14 np0005531888 nova_compute[186788]: 2025-11-22 07:43:14.928 186792 DEBUG nova.network.neutron [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:43:15 np0005531888 nova_compute[186788]: 2025-11-22 07:43:15.204 186792 DEBUG nova.compute.manager [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-changed-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:15 np0005531888 nova_compute[186788]: 2025-11-22 07:43:15.204 186792 DEBUG nova.compute.manager [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Refreshing instance network info cache due to event network-changed-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:43:15 np0005531888 nova_compute[186788]: 2025-11-22 07:43:15.205 186792 DEBUG oslo_concurrency.lockutils [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:15 np0005531888 nova_compute[186788]: 2025-11-22 07:43:15.455 186792 DEBUG nova.network.neutron [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:43:15 np0005531888 nova_compute[186788]: 2025-11-22 07:43:15.696 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:16 np0005531888 nova_compute[186788]: 2025-11-22 07:43:16.081 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.620 186792 DEBUG nova.network.neutron [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updating instance_info_cache with network_info: [{"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.660 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Releasing lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.661 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Instance network_info: |[{"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.661 186792 DEBUG oslo_concurrency.lockutils [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.662 186792 DEBUG nova.network.neutron [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Refreshing network info cache for port 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.666 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Start _get_guest_xml network_info=[{"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.672 186792 WARNING nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.678 186792 DEBUG nova.virt.libvirt.host [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.679 186792 DEBUG nova.virt.libvirt.host [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.690 186792 DEBUG nova.virt.libvirt.host [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.691 186792 DEBUG nova.virt.libvirt.host [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.692 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.693 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.693 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.693 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.693 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.694 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.694 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.694 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.694 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.694 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.695 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.695 186792 DEBUG nova.virt.hardware [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.699 186792 DEBUG nova.virt.libvirt.vif [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1444867234',display_name='tempest-ServersAdminTestJSON-server-1444867234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1444867234',id=14,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-66ppaeoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:10Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=99dcf8de-456d-4737-8362-0ddfc942c00a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.699 186792 DEBUG nova.network.os_vif_util [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.700 186792 DEBUG nova.network.os_vif_util [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.700 186792 DEBUG nova.objects.instance [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99dcf8de-456d-4737-8362-0ddfc942c00a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.713 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <uuid>99dcf8de-456d-4737-8362-0ddfc942c00a</uuid>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <name>instance-0000000e</name>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersAdminTestJSON-server-1444867234</nova:name>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:43:17</nova:creationTime>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:user uuid="7c0fb56fc41e44dfa23a0d45149e78e3">tempest-ServersAdminTestJSON-1843119868-project-member</nova:user>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:project uuid="9b004cb06df74de2903dae19345fd9c7">tempest-ServersAdminTestJSON-1843119868</nova:project>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        <nova:port uuid="43a936f8-40f5-4d2a-8ad7-791bc6dde9ee">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <entry name="serial">99dcf8de-456d-4737-8362-0ddfc942c00a</entry>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <entry name="uuid">99dcf8de-456d-4737-8362-0ddfc942c00a</entry>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.config"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:e9:0e:a5"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <target dev="tap43a936f8-40"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/console.log" append="off"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:43:17 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:43:17 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:43:17 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:43:17 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.715 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Preparing to wait for external event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.715 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.716 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.716 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.717 186792 DEBUG nova.virt.libvirt.vif [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1444867234',display_name='tempest-ServersAdminTestJSON-server-1444867234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1444867234',id=14,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-66ppaeoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:10Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=99dcf8de-456d-4737-8362-0ddfc942c00a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.717 186792 DEBUG nova.network.os_vif_util [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.718 186792 DEBUG nova.network.os_vif_util [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.718 186792 DEBUG os_vif [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.719 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.719 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.720 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.724 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.724 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43a936f8-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.724 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43a936f8-40, col_values=(('external_ids', {'iface-id': '43a936f8-40f5-4d2a-8ad7-791bc6dde9ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:0e:a5', 'vm-uuid': '99dcf8de-456d-4737-8362-0ddfc942c00a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:43:17 np0005531888 NetworkManager[55166]: <info>  [1763797397.7274] manager: (tap43a936f8-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.733 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:17 np0005531888 nova_compute[186788]: 2025-11-22 07:43:17.734 186792 INFO os_vif [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40')#033[00m
Nov 22 02:43:18 np0005531888 nova_compute[186788]: 2025-11-22 07:43:18.019 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:18 np0005531888 nova_compute[186788]: 2025-11-22 07:43:18.020 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:18 np0005531888 nova_compute[186788]: 2025-11-22 07:43:18.020 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No VIF found with MAC fa:16:3e:e9:0e:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:43:18 np0005531888 nova_compute[186788]: 2025-11-22 07:43:18.020 186792 INFO nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Using config drive#033[00m
Nov 22 02:43:18 np0005531888 nova_compute[186788]: 2025-11-22 07:43:18.968 186792 INFO nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Creating config drive at /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.config#033[00m
Nov 22 02:43:18 np0005531888 nova_compute[186788]: 2025-11-22 07:43:18.976 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp90foz4t3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.105 186792 DEBUG oslo_concurrency.processutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp90foz4t3" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:19 np0005531888 kernel: tap43a936f8-40: entered promiscuous mode
Nov 22 02:43:19 np0005531888 NetworkManager[55166]: <info>  [1763797399.1753] manager: (tap43a936f8-40): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.175 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:19Z|00056|binding|INFO|Claiming lport 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee for this chassis.
Nov 22 02:43:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:19Z|00057|binding|INFO|43a936f8-40f5-4d2a-8ad7-791bc6dde9ee: Claiming fa:16:3e:e9:0e:a5 10.100.0.11
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.181 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.206 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:0e:a5 10.100.0.11'], port_security=['fa:16:3e:e9:0e:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.208 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 bound to our chassis#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.210 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723#033[00m
Nov 22 02:43:19 np0005531888 systemd-udevd[215230]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.222 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6df182-58c0-4b77-bd01-35585379e895]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.224 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7ba1c27-61 in ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.226 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7ba1c27-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.226 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7b236ffa-baa8-470b-804b-976b4b6560f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.227 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[368bfcfd-b55a-4e26-97fa-2fbd8feb25b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 systemd-machined[153106]: New machine qemu-5-instance-0000000e.
Nov 22 02:43:19 np0005531888 NetworkManager[55166]: <info>  [1763797399.2335] device (tap43a936f8-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:43:19 np0005531888 NetworkManager[55166]: <info>  [1763797399.2343] device (tap43a936f8-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.239 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 systemd[1]: Started Virtual Machine qemu-5-instance-0000000e.
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.248 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b4b823-e273-403f-b210-9ed15ff54590]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:19Z|00058|binding|INFO|Setting lport 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee ovn-installed in OVS
Nov 22 02:43:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:19Z|00059|binding|INFO|Setting lport 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee up in Southbound
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.256 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.276 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd1e91d-14f6-4b95-95b9-b71761dd8b01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.308 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fc6813f0-7ad1-40d6-8cf3-8491efed3bfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 systemd-udevd[215236]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.314 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[90cbff04-067a-4104-9c81-c65d1e0c3c5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 NetworkManager[55166]: <info>  [1763797399.3170] manager: (tapd7ba1c27-60): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.354 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fedf20f6-7113-4d09-bfc3-7635d1423a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.358 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8925282d-790a-41f8-8bc3-90bd258eedbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 NetworkManager[55166]: <info>  [1763797399.3867] device (tapd7ba1c27-60): carrier: link connected
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.390 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d0372470-4d89-4293-aa8d-0f6504773b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.410 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[46e31b42-c670-43d8-890f-7f0191b87cf8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414003, 'reachable_time': 36120, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215265, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.431 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[41594721-33e0-464d-aca4-f38dd60f650a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:37eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414003, 'tstamp': 414003}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215267, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.453 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb72a7f-213a-495c-92de-20623d0bf4bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414003, 'reachable_time': 36120, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215268, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.484 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[231ee11b-bf31-4355-87dc-f946f4111a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.551 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6a27cc23-b9ac-47b7-9909-df64ab007ada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.553 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.553 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.553 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:19 np0005531888 NetworkManager[55166]: <info>  [1763797399.5562] manager: (tapd7ba1c27-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.555 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 kernel: tapd7ba1c27-60: entered promiscuous mode
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.561 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:19Z|00060|binding|INFO|Releasing lport 3c20001c-28e2-4cdd-9a7c-497ed470b31c from this chassis (sb_readonly=0)
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.560 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.562 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.576 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.578 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.579 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ebf2ea-02f1-4655-9e91-c7e34ff94f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.579 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:43:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:19.580 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'env', 'PROCESS_TAG=haproxy-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7ba1c27-6255-4c71-8e98-23a1c59b5723.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.617 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797399.6164355, 99dcf8de-456d-4737-8362-0ddfc942c00a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.618 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] VM Started (Lifecycle Event)#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.648 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.653 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797399.6170473, 99dcf8de-456d-4737-8362-0ddfc942c00a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.653 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.675 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.682 186792 DEBUG nova.compute.manager [req-80350bb8-2394-4276-a135-a2725ba6cc99 req-18e2ed9e-8639-41bb-82d0-1d913a4e9214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.683 186792 DEBUG oslo_concurrency.lockutils [req-80350bb8-2394-4276-a135-a2725ba6cc99 req-18e2ed9e-8639-41bb-82d0-1d913a4e9214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.683 186792 DEBUG oslo_concurrency.lockutils [req-80350bb8-2394-4276-a135-a2725ba6cc99 req-18e2ed9e-8639-41bb-82d0-1d913a4e9214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.684 186792 DEBUG oslo_concurrency.lockutils [req-80350bb8-2394-4276-a135-a2725ba6cc99 req-18e2ed9e-8639-41bb-82d0-1d913a4e9214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.684 186792 DEBUG nova.compute.manager [req-80350bb8-2394-4276-a135-a2725ba6cc99 req-18e2ed9e-8639-41bb-82d0-1d913a4e9214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Processing event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.685 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.686 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.690 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.695 186792 INFO nova.virt.libvirt.driver [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Instance spawned successfully.#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.696 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.716 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.717 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797399.6891897, 99dcf8de-456d-4737-8362-0ddfc942c00a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.717 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.727 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.728 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.728 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.729 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.730 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.730 186792 DEBUG nova.virt.libvirt.driver [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.740 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.752 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.776 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.839 186792 INFO nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Took 9.61 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.840 186792 DEBUG nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:19 np0005531888 nova_compute[186788]: 2025-11-22 07:43:19.960 186792 INFO nova.compute.manager [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Took 10.41 seconds to build instance.#033[00m
Nov 22 02:43:19 np0005531888 podman[215307]: 2025-11-22 07:43:19.987598573 +0000 UTC m=+0.053740595 container create aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 02:43:20 np0005531888 nova_compute[186788]: 2025-11-22 07:43:20.000 186792 DEBUG oslo_concurrency.lockutils [None req-41de57e7-efce-4668-a918-46efaaf97360 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:20 np0005531888 systemd[1]: Started libpod-conmon-aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817.scope.
Nov 22 02:43:20 np0005531888 podman[215307]: 2025-11-22 07:43:19.957773117 +0000 UTC m=+0.023915169 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:43:20 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:43:20 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07d187b57ae7423b2c78d7203f3512fa363a15af97b1e66e862291329f967dcb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:43:20 np0005531888 podman[215307]: 2025-11-22 07:43:20.088737683 +0000 UTC m=+0.154879725 container init aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 02:43:20 np0005531888 podman[215320]: 2025-11-22 07:43:20.093071721 +0000 UTC m=+0.071314815 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 02:43:20 np0005531888 podman[215307]: 2025-11-22 07:43:20.098872596 +0000 UTC m=+0.165014618 container start aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 02:43:20 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [NOTICE]   (215364) : New worker (215370) forked
Nov 22 02:43:20 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [NOTICE]   (215364) : Loading success.
Nov 22 02:43:20 np0005531888 podman[215323]: 2025-11-22 07:43:20.14943501 +0000 UTC m=+0.121117299 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.244 186792 DEBUG nova.network.neutron [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updated VIF entry in instance network info cache for port 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.245 186792 DEBUG nova.network.neutron [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updating instance_info_cache with network_info: [{"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.261 186792 DEBUG oslo_concurrency.lockutils [req-29fd9504-0e7e-4c97-ae28-fb1a11a7ff80 req-ec536349-0159-4f72-85cf-b157f7b4a76d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.811 186792 DEBUG nova.compute.manager [req-92f6aca9-5d42-408d-9f95-3a58677d50e7 req-5b248a4b-f7d7-4713-8aa7-4126e2a988dc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.812 186792 DEBUG oslo_concurrency.lockutils [req-92f6aca9-5d42-408d-9f95-3a58677d50e7 req-5b248a4b-f7d7-4713-8aa7-4126e2a988dc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.812 186792 DEBUG oslo_concurrency.lockutils [req-92f6aca9-5d42-408d-9f95-3a58677d50e7 req-5b248a4b-f7d7-4713-8aa7-4126e2a988dc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.812 186792 DEBUG oslo_concurrency.lockutils [req-92f6aca9-5d42-408d-9f95-3a58677d50e7 req-5b248a4b-f7d7-4713-8aa7-4126e2a988dc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.813 186792 DEBUG nova.compute.manager [req-92f6aca9-5d42-408d-9f95-3a58677d50e7 req-5b248a4b-f7d7-4713-8aa7-4126e2a988dc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] No waiting events found dispatching network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:43:21 np0005531888 nova_compute[186788]: 2025-11-22 07:43:21.813 186792 WARNING nova.compute.manager [req-92f6aca9-5d42-408d-9f95-3a58677d50e7 req-5b248a4b-f7d7-4713-8aa7-4126e2a988dc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received unexpected event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee for instance with vm_state active and task_state None.#033[00m
Nov 22 02:43:22 np0005531888 nova_compute[186788]: 2025-11-22 07:43:22.728 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:24 np0005531888 podman[215380]: 2025-11-22 07:43:24.686403017 +0000 UTC m=+0.056019462 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:43:26 np0005531888 nova_compute[186788]: 2025-11-22 07:43:26.085 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:27 np0005531888 nova_compute[186788]: 2025-11-22 07:43:27.731 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:27 np0005531888 podman[215405]: 2025-11-22 07:43:27.824764002 +0000 UTC m=+0.062858193 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 22 02:43:28 np0005531888 nova_compute[186788]: 2025-11-22 07:43:28.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:31 np0005531888 nova_compute[186788]: 2025-11-22 07:43:31.087 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.102 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.102 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.102 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.385 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.386 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.387 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.387 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 99dcf8de-456d-4737-8362-0ddfc942c00a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:32 np0005531888 nova_compute[186788]: 2025-11-22 07:43:32.735 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:34 np0005531888 podman[215441]: 2025-11-22 07:43:34.683205187 +0000 UTC m=+0.054874113 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 02:43:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:34.978 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:43:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:34.980 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:43:34 np0005531888 nova_compute[186788]: 2025-11-22 07:43:34.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:35Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:0e:a5 10.100.0.11
Nov 22 02:43:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:43:35Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:0e:a5 10.100.0.11
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.090 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.158 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updating instance_info_cache with network_info: [{"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.254 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.255 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.255 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.256 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.256 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.256 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.256 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.270 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.271 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.271 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:43:36 np0005531888 podman[215461]: 2025-11-22 07:43:36.684699623 +0000 UTC m=+0.055816658 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:43:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:36.795 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:36.796 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:36.797 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.984 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531888 nova_compute[186788]: 2025-11-22 07:43:36.984 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:37 np0005531888 nova_compute[186788]: 2025-11-22 07:43:37.737 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:37 np0005531888 nova_compute[186788]: 2025-11-22 07:43:37.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:37 np0005531888 nova_compute[186788]: 2025-11-22 07:43:37.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:37 np0005531888 nova_compute[186788]: 2025-11-22 07:43:37.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:37 np0005531888 nova_compute[186788]: 2025-11-22 07:43:37.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:37 np0005531888 nova_compute[186788]: 2025-11-22 07:43:37.976 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.051 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.114 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.115 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.179 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.347 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.349 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5611MB free_disk=73.43044662475586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.349 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.349 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.504 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 99dcf8de-456d-4737-8362-0ddfc942c00a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.505 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.505 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.619 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.631 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.656 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:43:38 np0005531888 nova_compute[186788]: 2025-11-22 07:43:38.656 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:39 np0005531888 nova_compute[186788]: 2025-11-22 07:43:39.656 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:39 np0005531888 nova_compute[186788]: 2025-11-22 07:43:39.657 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:43:39 np0005531888 podman[215492]: 2025-11-22 07:43:39.685694744 +0000 UTC m=+0.062214117 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 02:43:39 np0005531888 nova_compute[186788]: 2025-11-22 07:43:39.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:39 np0005531888 nova_compute[186788]: 2025-11-22 07:43:39.965 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.018 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.018 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.034 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.093 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.148 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.149 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.159 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.159 186792 INFO nova.compute.claims [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.355 186792 DEBUG nova.compute.provider_tree [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.376 186792 DEBUG nova.scheduler.client.report [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.398 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.398 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.463 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.463 186792 DEBUG nova.network.neutron [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.487 186792 INFO nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.513 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.674 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.675 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.675 186792 INFO nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Creating image(s)#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.676 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.676 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.677 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.691 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.758 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.759 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.759 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.770 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.828 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.829 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.861 186792 DEBUG nova.network.neutron [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.862 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.863 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.863 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.864 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.925 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.926 186792 DEBUG nova.virt.disk.api [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Checking if we can resize image /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.927 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.990 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.992 186792 DEBUG nova.virt.disk.api [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Cannot resize image /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:43:41 np0005531888 nova_compute[186788]: 2025-11-22 07:43:41.992 186792 DEBUG nova.objects.instance [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.015 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.016 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Ensure instance console log exists: /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.016 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.017 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.017 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.019 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.025 186792 WARNING nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.032 186792 DEBUG nova.virt.libvirt.host [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.033 186792 DEBUG nova.virt.libvirt.host [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.039 186792 DEBUG nova.virt.libvirt.host [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.040 186792 DEBUG nova.virt.libvirt.host [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.042 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.042 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.043 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.043 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.043 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.044 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.044 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.044 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.045 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.045 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.045 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.045 186792 DEBUG nova.virt.hardware [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.050 186792 DEBUG nova.objects.instance [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_devices' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.070 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <uuid>5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67</uuid>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <name>instance-00000011</name>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:name>tempest-MigrationsAdminTest-server-370989325</nova:name>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:43:42</nova:creationTime>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <entry name="serial">5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67</entry>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <entry name="uuid">5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67</entry>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/console.log" append="off"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:43:42 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:43:42 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:43:42 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:43:42 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.114 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.115 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.115 186792 INFO nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Using config drive#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.299 186792 INFO nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Creating config drive at /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.305 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo6b90jxx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.433 186792 DEBUG oslo_concurrency.processutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo6b90jxx" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:42 np0005531888 systemd-machined[153106]: New machine qemu-6-instance-00000011.
Nov 22 02:43:42 np0005531888 systemd[1]: Started Virtual Machine qemu-6-instance-00000011.
Nov 22 02:43:42 np0005531888 nova_compute[186788]: 2025-11-22 07:43:42.743 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:43:42.982 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.648 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797424.6477065, 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.649 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.654 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.655 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.660 186792 INFO nova.virt.libvirt.driver [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance spawned successfully.#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.661 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.673 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.681 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.684 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.685 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.685 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.685 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.686 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.686 186792 DEBUG nova.virt.libvirt.driver [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.711 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.712 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797424.6491916, 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.712 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] VM Started (Lifecycle Event)#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.744 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.749 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.768 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.779 186792 INFO nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Took 3.11 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.780 186792 DEBUG nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.851 186792 INFO nova.compute.manager [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Took 3.77 seconds to build instance.#033[00m
Nov 22 02:43:44 np0005531888 nova_compute[186788]: 2025-11-22 07:43:44.868 186792 DEBUG oslo_concurrency.lockutils [None req-3e2e914b-8c9a-4058-8373-d05722520475 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:46 np0005531888 nova_compute[186788]: 2025-11-22 07:43:46.095 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:47 np0005531888 nova_compute[186788]: 2025-11-22 07:43:47.747 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:49 np0005531888 nova_compute[186788]: 2025-11-22 07:43:49.155 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:49 np0005531888 nova_compute[186788]: 2025-11-22 07:43:49.156 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquired lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:49 np0005531888 nova_compute[186788]: 2025-11-22 07:43:49.156 186792 DEBUG nova.network.neutron [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:43:50 np0005531888 nova_compute[186788]: 2025-11-22 07:43:50.086 186792 DEBUG nova.network.neutron [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:43:50 np0005531888 podman[215556]: 2025-11-22 07:43:50.709310492 +0000 UTC m=+0.061407087 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:43:51 np0005531888 podman[215557]: 2025-11-22 07:43:51.032679378 +0000 UTC m=+0.390863266 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 22 02:43:51 np0005531888 nova_compute[186788]: 2025-11-22 07:43:51.080 186792 DEBUG nova.network.neutron [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:51 np0005531888 nova_compute[186788]: 2025-11-22 07:43:51.098 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Releasing lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:51 np0005531888 nova_compute[186788]: 2025-11-22 07:43:51.125 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:51 np0005531888 nova_compute[186788]: 2025-11-22 07:43:51.660 186792 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 02:43:51 np0005531888 nova_compute[186788]: 2025-11-22 07:43:51.661 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Creating file /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/55553fe8b4f44205ac68107d0517d39f.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 02:43:51 np0005531888 nova_compute[186788]: 2025-11-22 07:43:51.662 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/55553fe8b4f44205ac68107d0517d39f.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.079 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/55553fe8b4f44205ac68107d0517d39f.tmp" returned: 1 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.081 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/55553fe8b4f44205ac68107d0517d39f.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.081 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Creating directory /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.081 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.280 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.286 186792 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:43:52 np0005531888 nova_compute[186788]: 2025-11-22 07:43:52.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:55 np0005531888 podman[215602]: 2025-11-22 07:43:55.725552471 +0000 UTC m=+0.092087985 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:43:56 np0005531888 nova_compute[186788]: 2025-11-22 07:43:56.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531888 nova_compute[186788]: 2025-11-22 07:43:57.756 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:58 np0005531888 podman[215639]: 2025-11-22 07:43:58.691080054 +0000 UTC m=+0.061958060 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:44:01 np0005531888 nova_compute[186788]: 2025-11-22 07:44:01.130 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:02 np0005531888 nova_compute[186788]: 2025-11-22 07:44:02.335 186792 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:44:02 np0005531888 nova_compute[186788]: 2025-11-22 07:44:02.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:04 np0005531888 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000011.scope: Deactivated successfully.
Nov 22 02:44:04 np0005531888 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000011.scope: Consumed 16.042s CPU time.
Nov 22 02:44:04 np0005531888 systemd-machined[153106]: Machine qemu-6-instance-00000011 terminated.
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.354 186792 INFO nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.363 186792 INFO nova.virt.libvirt.driver [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance destroyed successfully.#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.369 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.438 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.439 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.504 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.506 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Copying file /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk to 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:05 np0005531888 nova_compute[186788]: 2025-11-22 07:44:05.506 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:05 np0005531888 podman[215675]: 2025-11-22 07:44:05.698549314 +0000 UTC m=+0.061549910 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.350 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "scp -r /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk" returned: 0 in 0.844s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.351 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Copying file /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.351 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk.config 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.557 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "scp -C -r /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk.config 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.558 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Copying file /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.558 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk.info 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.796 186792 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "scp -C -r /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_resize/disk.info 192.168.122.100:/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" returned: 0 in 0.238s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.891 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.891 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.896 186792 INFO nova.compute.rpcapi [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.897 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.913 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.913 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:06 np0005531888 nova_compute[186788]: 2025-11-22 07:44:06.914 186792 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:07 np0005531888 podman[215699]: 2025-11-22 07:44:07.701810004 +0000 UTC m=+0.061843418 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:44:07 np0005531888 nova_compute[186788]: 2025-11-22 07:44:07.763 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:10 np0005531888 podman[215723]: 2025-11-22 07:44:10.697832051 +0000 UTC m=+0.065055088 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:44:11 np0005531888 nova_compute[186788]: 2025-11-22 07:44:11.135 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.767 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.879 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.898 186792 WARNING nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 2 instances on the hypervisor.#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.898 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid 99dcf8de-456d-4737-8362-0ddfc942c00a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.899 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.900 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:12 np0005531888 nova_compute[186788]: 2025-11-22 07:44:12.930 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.644 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.645 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.645 186792 DEBUG nova.compute.manager [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Going to confirm migration 2 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.696 186792 DEBUG nova.objects.instance [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'info_cache' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.936 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.936 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:13 np0005531888 nova_compute[186788]: 2025-11-22 07:44:13.937 186792 DEBUG nova.network.neutron [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.123 186792 DEBUG nova.network.neutron [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.459 186792 DEBUG nova.network.neutron [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.472 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.473 186792 DEBUG nova.objects.instance [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.502 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.502 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.587 186792 DEBUG nova.compute.provider_tree [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.599 186792 DEBUG nova.scheduler.client.report [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.644 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.756 186792 INFO nova.scheduler.client.report [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Deleted allocation for migration d8d5546f-2f6f-4662-836a-f33c5797ec0e#033[00m
Nov 22 02:44:14 np0005531888 nova_compute[186788]: 2025-11-22 07:44:14.820 186792 DEBUG oslo_concurrency.lockutils [None req-6bf2182b-548d-4aef-9592-d5e7d79c2d0e 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 1.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:16 np0005531888 nova_compute[186788]: 2025-11-22 07:44:16.137 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:17 np0005531888 nova_compute[186788]: 2025-11-22 07:44:17.771 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:19 np0005531888 nova_compute[186788]: 2025-11-22 07:44:19.744 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797444.7433326, 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:19 np0005531888 nova_compute[186788]: 2025-11-22 07:44:19.745 186792 INFO nova.compute.manager [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:44:19 np0005531888 nova_compute[186788]: 2025-11-22 07:44:19.764 186792 DEBUG nova.compute.manager [None req-012803ff-e3d0-4cae-a08b-3f3044096959 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:21 np0005531888 nova_compute[186788]: 2025-11-22 07:44:21.139 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:21 np0005531888 podman[215744]: 2025-11-22 07:44:21.691465918 +0000 UTC m=+0.063062808 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:44:21 np0005531888 podman[215745]: 2025-11-22 07:44:21.726781452 +0000 UTC m=+0.093434758 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:44:22 np0005531888 nova_compute[186788]: 2025-11-22 07:44:22.775 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.385 186792 DEBUG nova.compute.manager [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.531 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.531 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.566 186792 DEBUG nova.objects.instance [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_requests' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.582 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.582 186792 INFO nova.compute.claims [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.583 186792 DEBUG nova.objects.instance [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.593 186792 DEBUG nova.objects.instance [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_devices' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.636 186792 INFO nova.compute.resource_tracker [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating resource usage from migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.637 186792 DEBUG nova.compute.resource_tracker [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Starting to track incoming migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988 with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.717 186792 DEBUG nova.compute.provider_tree [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.729 186792 DEBUG nova.scheduler.client.report [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.748 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:24 np0005531888 nova_compute[186788]: 2025-11-22 07:44:24.749 186792 INFO nova.compute.manager [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Migrating#033[00m
Nov 22 02:44:25 np0005531888 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:44:25 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:44:25 np0005531888 systemd-logind[825]: New session 27 of user nova.
Nov 22 02:44:25 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:44:25 np0005531888 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:44:25 np0005531888 systemd[215794]: Queued start job for default target Main User Target.
Nov 22 02:44:25 np0005531888 systemd[215794]: Created slice User Application Slice.
Nov 22 02:44:25 np0005531888 systemd[215794]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:44:25 np0005531888 systemd[215794]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:44:25 np0005531888 systemd[215794]: Reached target Paths.
Nov 22 02:44:25 np0005531888 systemd[215794]: Reached target Timers.
Nov 22 02:44:25 np0005531888 systemd[215794]: Starting D-Bus User Message Bus Socket...
Nov 22 02:44:25 np0005531888 systemd[215794]: Starting Create User's Volatile Files and Directories...
Nov 22 02:44:25 np0005531888 systemd[215794]: Finished Create User's Volatile Files and Directories.
Nov 22 02:44:25 np0005531888 systemd[215794]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:44:25 np0005531888 systemd[215794]: Reached target Sockets.
Nov 22 02:44:25 np0005531888 systemd[215794]: Reached target Basic System.
Nov 22 02:44:25 np0005531888 systemd[215794]: Reached target Main User Target.
Nov 22 02:44:25 np0005531888 systemd[215794]: Startup finished in 161ms.
Nov 22 02:44:25 np0005531888 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:44:25 np0005531888 systemd[1]: Started Session 27 of User nova.
Nov 22 02:44:25 np0005531888 systemd[1]: session-27.scope: Deactivated successfully.
Nov 22 02:44:25 np0005531888 podman[215809]: 2025-11-22 07:44:25.950667097 +0000 UTC m=+0.068968605 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:44:25 np0005531888 systemd-logind[825]: Session 27 logged out. Waiting for processes to exit.
Nov 22 02:44:25 np0005531888 systemd-logind[825]: Removed session 27.
Nov 22 02:44:26 np0005531888 systemd-logind[825]: New session 29 of user nova.
Nov 22 02:44:26 np0005531888 systemd[1]: Started Session 29 of User nova.
Nov 22 02:44:26 np0005531888 nova_compute[186788]: 2025-11-22 07:44:26.141 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:26 np0005531888 systemd[1]: session-29.scope: Deactivated successfully.
Nov 22 02:44:26 np0005531888 systemd-logind[825]: Session 29 logged out. Waiting for processes to exit.
Nov 22 02:44:26 np0005531888 systemd-logind[825]: Removed session 29.
Nov 22 02:44:27 np0005531888 nova_compute[186788]: 2025-11-22 07:44:27.780 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:29 np0005531888 podman[215841]: 2025-11-22 07:44:29.682889816 +0000 UTC m=+0.052305628 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:44:31 np0005531888 nova_compute[186788]: 2025-11-22 07:44:31.143 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531888 nova_compute[186788]: 2025-11-22 07:44:31.975 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:31 np0005531888 nova_compute[186788]: 2025-11-22 07:44:31.976 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:44:31 np0005531888 nova_compute[186788]: 2025-11-22 07:44:31.976 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:44:32 np0005531888 nova_compute[186788]: 2025-11-22 07:44:32.293 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:32 np0005531888 nova_compute[186788]: 2025-11-22 07:44:32.293 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:32 np0005531888 nova_compute[186788]: 2025-11-22 07:44:32.294 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:44:32 np0005531888 nova_compute[186788]: 2025-11-22 07:44:32.294 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 99dcf8de-456d-4737-8362-0ddfc942c00a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:32 np0005531888 nova_compute[186788]: 2025-11-22 07:44:32.783 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:33 np0005531888 nova_compute[186788]: 2025-11-22 07:44:33.841 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updating instance_info_cache with network_info: [{"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:34 np0005531888 nova_compute[186788]: 2025-11-22 07:44:34.010 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-99dcf8de-456d-4737-8362-0ddfc942c00a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:34 np0005531888 nova_compute[186788]: 2025-11-22 07:44:34.011 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:44:34 np0005531888 nova_compute[186788]: 2025-11-22 07:44:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:34 np0005531888 nova_compute[186788]: 2025-11-22 07:44:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:34 np0005531888 nova_compute[186788]: 2025-11-22 07:44:34.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:35.407 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:44:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:35.408 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:44:35 np0005531888 nova_compute[186788]: 2025-11-22 07:44:35.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:36 np0005531888 nova_compute[186788]: 2025-11-22 07:44:36.145 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:36 np0005531888 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:44:36 np0005531888 systemd[215794]: Activating special unit Exit the Session...
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped target Main User Target.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped target Basic System.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped target Paths.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped target Sockets.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped target Timers.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:44:36 np0005531888 systemd[215794]: Closed D-Bus User Message Bus Socket.
Nov 22 02:44:36 np0005531888 systemd[215794]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:44:36 np0005531888 systemd[215794]: Removed slice User Application Slice.
Nov 22 02:44:36 np0005531888 systemd[215794]: Reached target Shutdown.
Nov 22 02:44:36 np0005531888 systemd[215794]: Finished Exit the Session.
Nov 22 02:44:36 np0005531888 systemd[215794]: Reached target Exit the Session.
Nov 22 02:44:36 np0005531888 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:44:36 np0005531888 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:44:36 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:44:36 np0005531888 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:44:36 np0005531888 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:44:36 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:44:36 np0005531888 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:44:36 np0005531888 podman[215861]: 2025-11-22 07:44:36.468641927 +0000 UTC m=+0.090740410 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:44:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:36.796 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:36.797 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:36.799 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.837 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'name': 'tempest-ServersAdminTestJSON-server-1444867234', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000e', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9b004cb06df74de2903dae19345fd9c7', 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'hostId': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.839 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.859 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/cpu volume: 13610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d1bad34-330b-4417-9082-958c1e66414c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13610000000, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'timestamp': '2025-11-22T07:44:36.839257', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '18252e36-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.559283673, 'message_signature': '9cc9a9405157ea4b892d576982519d7282addc071871729cc086bba450d47f9c'}]}, 'timestamp': '2025-11-22 07:44:36.861334', '_unique_id': '42dbd617aaf14a6b87c4a6aaf6069490'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.863 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.864 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.876 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.877 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5992ee0-0ece-42a9-8f62-0d855f5509ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.864922', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1827ae68-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.564486403, 'message_signature': '534ffe7c2377b341ec92324ee5017a5d2c4daa5126645d8eff662ada0e32c308'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 
'99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.864922', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1827bcf0-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.564486403, 'message_signature': '35af21a888d1d06136ea671691c71eb76325f30926ab2c5d434f394be4645647'}]}, 'timestamp': '2025-11-22 07:44:36.877979', '_unique_id': '992e111e5f5c4713a247d5acaded99ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.879 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.880 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.905 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.read.latency volume: 1135657818 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.906 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.read.latency volume: 57267629 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3fefdd6-e221-45ee-8ecc-62e25f38572f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1135657818, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.880179', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '182c02c4-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '7ca26e58d10c018ec51bf6ec1ce414cc6164d2f49b6e9f43fda07fa60156f8dc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57267629, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': 
None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.880179', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '182c0f12-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': 'd893add10a179ef791a42dfcec8c5c705512c677600b4636a7038cdcc04f1f66'}]}, 'timestamp': '2025-11-22 07:44:36.906314', '_unique_id': '0797daa31b0548799f9036ef16cc4aae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.907 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.908 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.908 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>]
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.908 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.read.bytes volume: 30407168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.908 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52a4cf82-1e18-4a59-aeaa-31eb0f10ea19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30407168, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.908726', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '182c77ae-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '3987262f73b2addcba780d2a89b8a3b23c731605d25affa7dc30de5a22e11653'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.908726', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '182c7f7e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '634a78c4b8ec85d38aa74a6605b1351873302045bc3cdc495085c2b83032d247'}]}, 'timestamp': '2025-11-22 07:44:36.909136', '_unique_id': 'ee7c87d42eb94503940a2a6281c13ef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.913 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 99dcf8de-456d-4737-8362-0ddfc942c00a / tap43a936f8-40 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.914 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84fb280d-26c1-473e-82f7-83b9ea890add', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.910459', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182d47a6-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': '9fbe8a1393d0471f7fe84df7e27519907a82988f1fae3b15524bbae92a180bfe'}]}, 'timestamp': '2025-11-22 07:44:36.914315', '_unique_id': '4f9cbff228cb4eca821d6716bee8f2c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.915 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bea2de3-4f66-40e3-bb65-a38b29a9ebe9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.915943', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182d921a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': 'b2831fedc3c61cb4dd54792893ba2c11f0f8b712d47033c6479f17e4577500f1'}]}, 'timestamp': '2025-11-22 07:44:36.916184', '_unique_id': '4e99e03bc5b54f1aa6fc8a38522dc33c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.917 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.917 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6a7c49a-e135-4c08-8f95-06b31ccf8f7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.917332', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '182dc942-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '520163229724f217200618fca9c2c0e0469219408ac6cf337a6aea3c0d76a755'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.917332', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '182dd298-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '0fe4e70726cea3b1f25577213460d514e7965a478dd966b55a76f8dc13de7cd3'}]}, 'timestamp': '2025-11-22 07:44:36.917818', '_unique_id': 'a4dc8b3271b2416680b1914e0881f5ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.918 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.read.requests volume: 1084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7b0535a-1d7e-4e40-a91e-cdb21f399940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1084, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.918913', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '182e059c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': 'b05b175ee4b82779fdee92b1c611f8e89cb2b1e55036502440b4ebbb5adbdae3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.918913', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '182e0d8a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '4f139c9c638aa91e575170b7a7df77d250b60e8c6b8c42a73f61e6f6c4438bc5'}]}, 'timestamp': '2025-11-22 07:44:36.919327', '_unique_id': '36c570f4980b45b08ca1ea975598929f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.920 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c40292a6-ab8f-4d7d-89dd-7c8802455bab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.920478', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182e4458-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': '4481e42360baeea61efafd1c8898c249567e5ac902fdb5e1bea445d06700a3c3'}]}, 'timestamp': '2025-11-22 07:44:36.920778', '_unique_id': '6d4af8103f0b468c9127cfd3d8e0444d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.921 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7dd8d97-d7be-4815-8ac8-139de2b6b750', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.921955', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182e7c84-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': 'a9d46dc2b7e6cbafec6bf305e94a2de8ba7cbf1f6c5dc912db8f8b462fcfd713'}]}, 'timestamp': '2025-11-22 07:44:36.922180', '_unique_id': 'd643824987494e52a73360a243d7687e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.923 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99717682-87cc-464b-b18c-c147ae61f646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.923354', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182eb33e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': 'f636b7683cca4b800b7b6f57b13f4763f760089c696a550d152f5eb5fc1e8807'}]}, 'timestamp': '2025-11-22 07:44:36.923611', '_unique_id': '777d27c4537741b6b3a0a6c8073aedc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.924 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca59167-425a-468d-81e2-2468d586502c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.924704', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '182ee7c8-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.564486403, 'message_signature': 'ab027009604fc4e8c9788dcaa698c81a6a42f3971da9d992999dfe88f0f59ff1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.924704', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '182eeffc-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.564486403, 'message_signature': 'f47d0d980a04e27c4f721c714d735a155698248dfb5ac61b4fed2bdba5a8614c'}]}, 'timestamp': '2025-11-22 07:44:36.925124', '_unique_id': '16e8328d360a4129953793aa33059aa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.incoming.bytes volume: 1604 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e13bfcc6-8999-4240-b75d-b3133af7ef31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1604, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.926278', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182f254e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': 'bcfa0734bf55c2fb8f377c4a86896b7a5737b7f0f0b00e34705469acd627519b'}]}, 'timestamp': '2025-11-22 07:44:36.926517', '_unique_id': '6f3adc4b7fa7485a91ccf9c5bbe0f59d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.927 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.927 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>]
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.927 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f4e8fa5-1051-4af6-9c96-1752311c2c27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.927927', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182f65cc-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': '414d0093c8a9c99ca03c56f6e85be232d2c2857d2857c802d22f6851d26f3432'}]}, 'timestamp': '2025-11-22 07:44:36.928151', '_unique_id': '4bfb039da2d547e2969578455135188a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9460bbf-f5a8-4f41-976a-73f94105608d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.929217', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '182f97fe-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': 'f246bf68d440f7ac0a19b0086a82ce2e4ce96d4f66cda45580503d107470238d'}]}, 'timestamp': '2025-11-22 07:44:36.929456', '_unique_id': '65aed6af33444f90acd17a9b99ec0269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.930 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.930 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b751b3e-e2ac-4293-8e6d-24fb39053393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.930664', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '182fd2be-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.564486403, 'message_signature': 'e5f28bbbd5b41a27a00f28ab711ff22481a6d62b94698aae874471b507f5f43e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 
'99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.930664', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '182fda8e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.564486403, 'message_signature': '88d7196e5ce2eae2fdceef83e76e1925f6054119fd3131f99e84463c63ce7f3f'}]}, 'timestamp': '2025-11-22 07:44:36.931127', '_unique_id': '34e0f6a2001a4fb1afe7cf1335e29391'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>]
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1444867234>]
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.write.latency volume: 27848045052 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.932 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be02d61f-a318-4b8d-b2db-cc0fd33267a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27848045052, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.932722', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '183020d4-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': 'ccf9758c766065037d8fe679f150e63ede8b0eea3cc0c87e2bcff953d68d8644'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': 
None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.932722', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18302868-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '096f9c5671a48504af268fb5396b3046c310f871f893a76f91016e449cf338a8'}]}, 'timestamp': '2025-11-22 07:44:36.933119', '_unique_id': 'b07e0f07b6be47899792cdd51ff656b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.934 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.write.requests volume: 289 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.934 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1caecc40-53af-47ef-b551-30cbe21bcbaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 289, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-vda', 'timestamp': '2025-11-22T07:44:36.934417', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '183063b4-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': 'ded170bcd2f6e5544c2c6691028d965fb1fd815591afbf67a7c07d8da4bce530'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a-sda', 'timestamp': '2025-11-22T07:44:36.934417', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18306c74-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.579718323, 'message_signature': '6452fa33710b580d8e34033bf7a6df18c4c4bda64c034d7d1d9599dabe8d7065'}]}, 'timestamp': '2025-11-22 07:44:36.934862', '_unique_id': 'acd82921ff164ace8f82aa4de8ab402f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.935 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d069319-4d6f-4193-92a5-fc85672be1cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.935931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '18309e6a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': '5345468f28d1483393dc27a29ea5cf1066206afbcb9cc9daed541116bf2b0737'}]}, 'timestamp': '2025-11-22 07:44:36.936155', '_unique_id': 'ce6580c655044550b24894955b11ea91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/memory.usage volume: 42.5 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca1345d8-4d91-4514-9e19-03fcfb8049bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.5, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'timestamp': '2025-11-22T07:44:36.937210', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'instance-0000000e', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1830d182-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.559283673, 'message_signature': '3f9bf30d12662d4498d0c0c0a7be187df6283313eb466f61b89f79fa37bf0db4'}]}, 'timestamp': '2025-11-22 07:44:36.937461', '_unique_id': '18ac8548ffde47c68f38cc33425f6a27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.938 12 DEBUG ceilometer.compute.pollsters [-] 99dcf8de-456d-4737-8362-0ddfc942c00a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11320e1b-6d44-4853-bc25-ca995e6a6f5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000e-99dcf8de-456d-4737-8362-0ddfc942c00a-tap43a936f8-40', 'timestamp': '2025-11-22T07:44:36.938681', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1444867234', 'name': 'tap43a936f8-40', 'instance_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'instance_type': 'm1.nano', 'host': 'b337fcd7220600a4504dc176c78bdd57358c0495f808472192bf8aa9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e9:0e:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43a936f8-40'}, 'message_id': '18310ba2-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4217.610029942, 'message_signature': '06e70bf135bdc08a3544a4b80af3fa4488de62b316db0c81cb801660b84c4fd6'}]}, 'timestamp': '2025-11-22 07:44:36.938989', '_unique_id': '8d827d11d46f452d9d22c279e8287abc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:44:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.370 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.371 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.372 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.372 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.372 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.382 186792 INFO nova.compute.manager [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Terminating instance#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.390 186792 DEBUG nova.compute.manager [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.411 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:37 np0005531888 kernel: tap43a936f8-40 (unregistering): left promiscuous mode
Nov 22 02:44:37 np0005531888 NetworkManager[55166]: <info>  [1763797477.4231] device (tap43a936f8-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:44:37 np0005531888 ovn_controller[95067]: 2025-11-22T07:44:37Z|00061|binding|INFO|Releasing lport 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee from this chassis (sb_readonly=0)
Nov 22 02:44:37 np0005531888 ovn_controller[95067]: 2025-11-22T07:44:37Z|00062|binding|INFO|Setting lport 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee down in Southbound
Nov 22 02:44:37 np0005531888 ovn_controller[95067]: 2025-11-22T07:44:37Z|00063|binding|INFO|Removing iface tap43a936f8-40 ovn-installed in OVS
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.478 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:0e:a5 10.100.0.11'], port_security=['fa:16:3e:e9:0e:a5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '99dcf8de-456d-4737-8362-0ddfc942c00a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.479 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 43a936f8-40f5-4d2a-8ad7-791bc6dde9ee in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 unbound from our chassis#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.481 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7ba1c27-6255-4c71-8e98-23a1c59b5723, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.483 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[42ece28b-6626-45d2-b433-a141f833799f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.483 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 namespace which is not needed anymore#033[00m
Nov 22 02:44:37 np0005531888 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Nov 22 02:44:37 np0005531888 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000e.scope: Consumed 17.415s CPU time.
Nov 22 02:44:37 np0005531888 systemd-machined[153106]: Machine qemu-5-instance-0000000e terminated.
Nov 22 02:44:37 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [NOTICE]   (215364) : haproxy version is 2.8.14-c23fe91
Nov 22 02:44:37 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [NOTICE]   (215364) : path to executable is /usr/sbin/haproxy
Nov 22 02:44:37 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [WARNING]  (215364) : Exiting Master process...
Nov 22 02:44:37 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [ALERT]    (215364) : Current worker (215370) exited with code 143 (Terminated)
Nov 22 02:44:37 np0005531888 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215342]: [WARNING]  (215364) : All workers exited. Exiting... (0)
Nov 22 02:44:37 np0005531888 systemd[1]: libpod-aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817.scope: Deactivated successfully.
Nov 22 02:44:37 np0005531888 podman[215905]: 2025-11-22 07:44:37.639375216 +0000 UTC m=+0.057349155 container died aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.669 186792 INFO nova.virt.libvirt.driver [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Instance destroyed successfully.#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.670 186792 DEBUG nova.objects.instance [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'resources' on Instance uuid 99dcf8de-456d-4737-8362-0ddfc942c00a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:37 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817-userdata-shm.mount: Deactivated successfully.
Nov 22 02:44:37 np0005531888 systemd[1]: var-lib-containers-storage-overlay-07d187b57ae7423b2c78d7203f3512fa363a15af97b1e66e862291329f967dcb-merged.mount: Deactivated successfully.
Nov 22 02:44:37 np0005531888 podman[215905]: 2025-11-22 07:44:37.684224408 +0000 UTC m=+0.102198337 container cleanup aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.689 186792 DEBUG nova.virt.libvirt.vif [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1444867234',display_name='tempest-ServersAdminTestJSON-server-1444867234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1444867234',id=14,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-66ppaeoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:43:19Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=99dcf8de-456d-4737-8362-0ddfc942c00a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.690 186792 DEBUG nova.network.os_vif_util [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "address": "fa:16:3e:e9:0e:a5", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a936f8-40", "ovs_interfaceid": "43a936f8-40f5-4d2a-8ad7-791bc6dde9ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:44:37 np0005531888 systemd[1]: libpod-conmon-aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817.scope: Deactivated successfully.
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.691 186792 DEBUG nova.network.os_vif_util [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.692 186792 DEBUG os_vif [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.693 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.694 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43a936f8-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.695 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.701 186792 INFO os_vif [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:0e:a5,bridge_name='br-int',has_traffic_filtering=True,id=43a936f8-40f5-4d2a-8ad7-791bc6dde9ee,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a936f8-40')#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.702 186792 INFO nova.virt.libvirt.driver [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Deleting instance files /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a_del#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.703 186792 INFO nova.virt.libvirt.driver [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Deletion of /var/lib/nova/instances/99dcf8de-456d-4737-8362-0ddfc942c00a_del complete#033[00m
Nov 22 02:44:37 np0005531888 podman[215954]: 2025-11-22 07:44:37.752539125 +0000 UTC m=+0.043601841 container remove aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.758 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7a3546-4dc9-4548-a066-c879cfa0d913]: (4, ('Sat Nov 22 07:44:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 (aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817)\naab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817\nSat Nov 22 07:44:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 (aab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817)\naab152e29f8415983b2a027f0d34551797b6871226c01d105adc0f5b1ff18817\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.760 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[60a91a52-b51e-45aa-b7a0-35beb1d0083d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.761 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:37 np0005531888 kernel: tapd7ba1c27-60: left promiscuous mode
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.764 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.776 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.778 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9d61e129-cba5-41cf-b456-0b2916ab9e28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.790 186792 INFO nova.compute.manager [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.791 186792 DEBUG oslo.service.loopingcall [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.791 186792 DEBUG nova.compute.manager [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.791 186792 DEBUG nova.network.neutron [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.793 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[efd7249e-af04-4ebd-a6e6-fded2e776500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.794 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[405235c9-4a3d-4e07-9cc5-2c4ea921c355]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.814 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[95d5fa6e-380a-4972-a3b9-1a564e150f7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413994, 'reachable_time': 31399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215975, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.818 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:44:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:44:37.818 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[adb5dc41-269d-4d14-835f-8a26e2ff8ee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:37 np0005531888 systemd[1]: run-netns-ovnmeta\x2dd7ba1c27\x2d6255\x2d4c71\x2d8e98\x2d23a1c59b5723.mount: Deactivated successfully.
Nov 22 02:44:37 np0005531888 podman[215967]: 2025-11-22 07:44:37.858084956 +0000 UTC m=+0.061182292 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:37 np0005531888 nova_compute[186788]: 2025-11-22 07:44:37.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.035 186792 DEBUG nova.compute.manager [req-40e68f6b-62dc-401c-8a1f-39a3657c33ae req-1d65c668-2dd3-40f6-a69a-32acf57250a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-vif-unplugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.035 186792 DEBUG oslo_concurrency.lockutils [req-40e68f6b-62dc-401c-8a1f-39a3657c33ae req-1d65c668-2dd3-40f6-a69a-32acf57250a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.035 186792 DEBUG oslo_concurrency.lockutils [req-40e68f6b-62dc-401c-8a1f-39a3657c33ae req-1d65c668-2dd3-40f6-a69a-32acf57250a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.036 186792 DEBUG oslo_concurrency.lockutils [req-40e68f6b-62dc-401c-8a1f-39a3657c33ae req-1d65c668-2dd3-40f6-a69a-32acf57250a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.036 186792 DEBUG nova.compute.manager [req-40e68f6b-62dc-401c-8a1f-39a3657c33ae req-1d65c668-2dd3-40f6-a69a-32acf57250a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] No waiting events found dispatching network-vif-unplugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.036 186792 DEBUG nova.compute.manager [req-40e68f6b-62dc-401c-8a1f-39a3657c33ae req-1d65c668-2dd3-40f6-a69a-32acf57250a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-vif-unplugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.866 186792 DEBUG nova.network.neutron [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.898 186792 INFO nova.compute.manager [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Took 1.11 seconds to deallocate network for instance.#033[00m
Nov 22 02:44:38 np0005531888 nova_compute[186788]: 2025-11-22 07:44:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.190 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.191 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.191 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.191 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.214 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.215 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.274 186792 DEBUG nova.scheduler.client.report [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.318 186792 DEBUG nova.scheduler.client.report [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.319 186792 DEBUG nova.compute.provider_tree [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.348 186792 DEBUG nova.scheduler.client.report [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.380 186792 DEBUG nova.scheduler.client.report [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.418 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.419 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5777MB free_disk=73.4590950012207GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.419 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.458 186792 DEBUG nova.compute.provider_tree [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.472 186792 DEBUG nova.scheduler.client.report [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:39 np0005531888 systemd-logind[825]: New session 30 of user nova.
Nov 22 02:44:39 np0005531888 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:44:39 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:44:39 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:44:39 np0005531888 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.640 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.643 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:39 np0005531888 systemd[216000]: Queued start job for default target Main User Target.
Nov 22 02:44:39 np0005531888 systemd[216000]: Created slice User Application Slice.
Nov 22 02:44:39 np0005531888 systemd[216000]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:44:39 np0005531888 systemd[216000]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:44:39 np0005531888 systemd[216000]: Reached target Paths.
Nov 22 02:44:39 np0005531888 systemd[216000]: Reached target Timers.
Nov 22 02:44:39 np0005531888 systemd[216000]: Starting D-Bus User Message Bus Socket...
Nov 22 02:44:39 np0005531888 systemd[216000]: Starting Create User's Volatile Files and Directories...
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.690 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration for instance 3cf2b323-ba35-4807-8337-288f6c983860 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 02:44:39 np0005531888 systemd[216000]: Finished Create User's Volatile Files and Directories.
Nov 22 02:44:39 np0005531888 systemd[216000]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:44:39 np0005531888 systemd[216000]: Reached target Sockets.
Nov 22 02:44:39 np0005531888 systemd[216000]: Reached target Basic System.
Nov 22 02:44:39 np0005531888 systemd[216000]: Reached target Main User Target.
Nov 22 02:44:39 np0005531888 systemd[216000]: Startup finished in 137ms.
Nov 22 02:44:39 np0005531888 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.708 186792 INFO nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating resource usage from migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.708 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Starting to track incoming migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988 with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 02:44:39 np0005531888 systemd[1]: Started Session 30 of User nova.
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.752 186792 WARNING nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 3cf2b323-ba35-4807-8337-288f6c983860 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}.#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.754 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.754 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.792 186792 INFO nova.scheduler.client.report [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Deleted allocations for instance 99dcf8de-456d-4737-8362-0ddfc942c00a#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.806 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.828 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.870 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.870 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:39 np0005531888 nova_compute[186788]: 2025-11-22 07:44:39.880 186792 DEBUG oslo_concurrency.lockutils [None req-488bbd68-2efd-4008-a034-1167eacc95e9 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.142 186792 DEBUG nova.compute.manager [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.143 186792 DEBUG oslo_concurrency.lockutils [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.143 186792 DEBUG oslo_concurrency.lockutils [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.143 186792 DEBUG oslo_concurrency.lockutils [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "99dcf8de-456d-4737-8362-0ddfc942c00a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.143 186792 DEBUG nova.compute.manager [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] No waiting events found dispatching network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.144 186792 WARNING nova.compute.manager [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received unexpected event network-vif-plugged-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.144 186792 DEBUG nova.compute.manager [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Received event network-vif-deleted-43a936f8-40f5-4d2a-8ad7-791bc6dde9ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:40 np0005531888 systemd[1]: session-30.scope: Deactivated successfully.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: Session 30 logged out. Waiting for processes to exit.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: Removed session 30.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: New session 32 of user nova.
Nov 22 02:44:40 np0005531888 systemd[1]: Started Session 32 of User nova.
Nov 22 02:44:40 np0005531888 systemd[1]: session-32.scope: Deactivated successfully.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: Session 32 logged out. Waiting for processes to exit.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: Removed session 32.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: New session 33 of user nova.
Nov 22 02:44:40 np0005531888 systemd[1]: Started Session 33 of User nova.
Nov 22 02:44:40 np0005531888 systemd[1]: session-33.scope: Deactivated successfully.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: Session 33 logged out. Waiting for processes to exit.
Nov 22 02:44:40 np0005531888 systemd-logind[825]: Removed session 33.
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.870 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:40 np0005531888 nova_compute[186788]: 2025-11-22 07:44:40.871 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.147 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.270 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.270 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.271 186792 DEBUG nova.network.neutron [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.691 186792 DEBUG nova.network.neutron [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:44:41 np0005531888 podman[216027]: 2025-11-22 07:44:41.722696435 +0000 UTC m=+0.074377101 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:41 np0005531888 nova_compute[186788]: 2025-11-22 07:44:41.993 186792 DEBUG nova.network.neutron [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.010 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.198 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.200 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.200 186792 INFO nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Creating image(s)#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.201 186792 DEBUG nova.objects.instance [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.214 186792 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.271 186792 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.273 186792 DEBUG nova.virt.disk.api [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Checking if we can resize image /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.273 186792 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.339 186792 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.340 186792 DEBUG nova.virt.disk.api [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Cannot resize image /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.355 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.355 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Ensure instance console log exists: /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.356 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.356 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.356 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.358 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.364 186792 WARNING nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.370 186792 DEBUG nova.virt.libvirt.host [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.371 186792 DEBUG nova.virt.libvirt.host [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.375 186792 DEBUG nova.virt.libvirt.host [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.376 186792 DEBUG nova.virt.libvirt.host [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.377 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.377 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.378 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.378 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.378 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.378 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.378 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.379 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.379 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.379 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.379 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.379 186792 DEBUG nova.virt.hardware [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.379 186792 DEBUG nova.objects.instance [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.396 186792 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.455 186792 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.456 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.457 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.458 186792 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.461 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <uuid>3cf2b323-ba35-4807-8337-288f6c983860</uuid>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <name>instance-00000013</name>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <memory>196608</memory>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:name>tempest-MigrationsAdminTest-server-1564380060</nova:name>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:44:42</nova:creationTime>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.micro">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:memory>192</nova:memory>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <entry name="serial">3cf2b323-ba35-4807-8337-288f6c983860</entry>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <entry name="uuid">3cf2b323-ba35-4807-8337-288f6c983860</entry>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/console.log" append="off"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:44:42 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:44:42 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:44:42 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:44:42 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.509 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.510 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.511 186792 INFO nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Using config drive#033[00m
Nov 22 02:44:42 np0005531888 systemd-machined[153106]: New machine qemu-7-instance-00000013.
Nov 22 02:44:42 np0005531888 systemd[1]: Started Virtual Machine qemu-7-instance-00000013.
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.697 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.893 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797482.8932781, 3cf2b323-ba35-4807-8337-288f6c983860 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.894 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.897 186792 DEBUG nova.compute.manager [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.900 186792 INFO nova.virt.libvirt.driver [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance running successfully.#033[00m
Nov 22 02:44:42 np0005531888 virtqemud[186358]: argument unsupported: QEMU guest agent is not configured
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.903 186792 DEBUG nova.virt.libvirt.guest [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.904 186792 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.935 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:42 np0005531888 nova_compute[186788]: 2025-11-22 07:44:42.938 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:44:43 np0005531888 nova_compute[186788]: 2025-11-22 07:44:43.108 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:44:43 np0005531888 nova_compute[186788]: 2025-11-22 07:44:43.108 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797482.8946304, 3cf2b323-ba35-4807-8337-288f6c983860 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:43 np0005531888 nova_compute[186788]: 2025-11-22 07:44:43.108 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] VM Started (Lifecycle Event)#033[00m
Nov 22 02:44:43 np0005531888 nova_compute[186788]: 2025-11-22 07:44:43.139 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:43 np0005531888 nova_compute[186788]: 2025-11-22 07:44:43.142 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:44:43 np0005531888 nova_compute[186788]: 2025-11-22 07:44:43.191 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:44:46 np0005531888 nova_compute[186788]: 2025-11-22 07:44:46.197 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:47 np0005531888 nova_compute[186788]: 2025-11-22 07:44:47.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:50 np0005531888 nova_compute[186788]: 2025-11-22 07:44:50.815 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Creating tmpfile /var/lib/nova/instances/tmpi234bmsv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 22 02:44:50 np0005531888 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:44:50 np0005531888 systemd[216000]: Activating special unit Exit the Session...
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped target Main User Target.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped target Basic System.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped target Paths.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped target Sockets.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped target Timers.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:44:50 np0005531888 systemd[216000]: Closed D-Bus User Message Bus Socket.
Nov 22 02:44:50 np0005531888 systemd[216000]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:44:50 np0005531888 systemd[216000]: Removed slice User Application Slice.
Nov 22 02:44:50 np0005531888 systemd[216000]: Reached target Shutdown.
Nov 22 02:44:50 np0005531888 systemd[216000]: Finished Exit the Session.
Nov 22 02:44:50 np0005531888 systemd[216000]: Reached target Exit the Session.
Nov 22 02:44:50 np0005531888 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:44:50 np0005531888 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:44:50 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:44:50 np0005531888 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:44:50 np0005531888 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:44:50 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:44:50 np0005531888 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:44:51 np0005531888 nova_compute[186788]: 2025-11-22 07:44:51.145 186792 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 22 02:44:51 np0005531888 nova_compute[186788]: 2025-11-22 07:44:51.200 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:51 np0005531888 nova_compute[186788]: 2025-11-22 07:44:51.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:52 np0005531888 nova_compute[186788]: 2025-11-22 07:44:52.667 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797477.6663334, 99dcf8de-456d-4737-8362-0ddfc942c00a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:52 np0005531888 nova_compute[186788]: 2025-11-22 07:44:52.668 186792 INFO nova.compute.manager [-] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:44:52 np0005531888 nova_compute[186788]: 2025-11-22 07:44:52.693 186792 DEBUG nova.compute.manager [None req-6b577dee-ec8d-4233-af29-2a375bc33c64 - - - - - -] [instance: 99dcf8de-456d-4737-8362-0ddfc942c00a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:52 np0005531888 nova_compute[186788]: 2025-11-22 07:44:52.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:52 np0005531888 podman[216083]: 2025-11-22 07:44:52.723201025 +0000 UTC m=+0.075878489 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:44:52 np0005531888 podman[216084]: 2025-11-22 07:44:52.76140155 +0000 UTC m=+0.113932220 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 02:44:53 np0005531888 nova_compute[186788]: 2025-11-22 07:44:53.336 186792 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='feb5ca5f-df67-4f29-9c21-71ba30b5af9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 22 02:44:53 np0005531888 nova_compute[186788]: 2025-11-22 07:44:53.382 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:53 np0005531888 nova_compute[186788]: 2025-11-22 07:44:53.383 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:53 np0005531888 nova_compute[186788]: 2025-11-22 07:44:53.383 186792 DEBUG nova.network.neutron [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.507 186792 DEBUG nova.network.neutron [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating instance_info_cache with network_info: [{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.670 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.682 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='feb5ca5f-df67-4f29-9c21-71ba30b5af9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.683 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Creating instance directory: /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.683 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Creating disk.info with the contents: {'/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk': 'qcow2', '/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.684 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.684 186792 DEBUG nova.objects.instance [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'trusted_certs' on Instance uuid feb5ca5f-df67-4f29-9c21-71ba30b5af9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.744 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.822 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.824 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.825 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.836 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.897 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.899 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.937 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.939 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.940 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.995 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.997 186792 DEBUG nova.virt.disk.api [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Checking if we can resize image /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:44:55 np0005531888 nova_compute[186788]: 2025-11-22 07:44:55.998 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.055 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.056 186792 DEBUG nova.virt.disk.api [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Cannot resize image /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.057 186792 DEBUG nova.objects.instance [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'migration_context' on Instance uuid feb5ca5f-df67-4f29-9c21-71ba30b5af9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.070 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.095 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.098 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config to /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.098 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.202 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.534 186792 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.535 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.536 186792 DEBUG nova.virt.libvirt.vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1661145969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1661145969',id=20,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-lpdner4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:44Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.537 186792 DEBUG nova.network.os_vif_util [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.538 186792 DEBUG nova.network.os_vif_util [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.538 186792 DEBUG os_vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.539 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.540 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.541 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.544 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.545 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ab0012c-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.545 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ab0012c-e7, col_values=(('external_ids', {'iface-id': '4ab0012c-e73f-4cd6-b146-527583d046f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:22:c9', 'vm-uuid': 'feb5ca5f-df67-4f29-9c21-71ba30b5af9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.547 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531888 NetworkManager[55166]: <info>  [1763797496.5492] manager: (tap4ab0012c-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.550 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.556 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.558 186792 INFO os_vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7')#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.559 186792 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 22 02:44:56 np0005531888 nova_compute[186788]: 2025-11-22 07:44:56.559 186792 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='feb5ca5f-df67-4f29-9c21-71ba30b5af9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 22 02:44:56 np0005531888 podman[216161]: 2025-11-22 07:44:56.705776805 +0000 UTC m=+0.064005292 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.680 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.681 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.699 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.778 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.779 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.790 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.792 186792 INFO nova.compute.claims [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.949 186792 DEBUG nova.compute.provider_tree [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.963 186792 DEBUG nova.scheduler.client.report [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.991 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:57 np0005531888 nova_compute[186788]: 2025-11-22 07:44:57.993 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.075 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.076 186792 DEBUG nova.network.neutron [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.130 186792 INFO nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.173 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.307 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.308 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.309 186792 INFO nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Creating image(s)#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.310 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "/var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.310 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "/var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.311 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "/var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.324 186792 DEBUG nova.policy [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7564d82375174ff8a84321bbd9d3ff32', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '11ecece124664a798ff1c73359735918', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.329 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.386 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.388 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.389 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.400 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.463 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.464 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.503 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.505 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.505 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.572 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.574 186792 DEBUG nova.virt.disk.api [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Checking if we can resize image /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.574 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.631 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.632 186792 DEBUG nova.virt.disk.api [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Cannot resize image /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.633 186792 DEBUG nova.objects.instance [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lazy-loading 'migration_context' on Instance uuid 2160c105-2e0f-46fc-9039-28d7d834fc0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.666 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.667 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Ensure instance console log exists: /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.668 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.668 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:58 np0005531888 nova_compute[186788]: 2025-11-22 07:44:58.668 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.257 186792 DEBUG nova.network.neutron [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Port 4ab0012c-e73f-4cd6-b146-527583d046f3 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.267 186792 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='feb5ca5f-df67-4f29-9c21-71ba30b5af9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.388 186792 DEBUG nova.network.neutron [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Successfully created port: a6bc44ec-f3c6-4e42-8d58-733223efef87 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:45:00 np0005531888 systemd[1]: Starting libvirt proxy daemon...
Nov 22 02:45:00 np0005531888 systemd[1]: Started libvirt proxy daemon.
Nov 22 02:45:00 np0005531888 podman[216200]: 2025-11-22 07:45:00.511921672 +0000 UTC m=+0.056793671 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:45:00 np0005531888 kernel: tap4ab0012c-e7: entered promiscuous mode
Nov 22 02:45:00 np0005531888 NetworkManager[55166]: <info>  [1763797500.6353] manager: (tap4ab0012c-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Nov 22 02:45:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:00Z|00064|binding|INFO|Claiming lport 4ab0012c-e73f-4cd6-b146-527583d046f3 for this additional chassis.
Nov 22 02:45:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:00Z|00065|binding|INFO|4ab0012c-e73f-4cd6-b146-527583d046f3: Claiming fa:16:3e:c2:22:c9 10.100.0.5
Nov 22 02:45:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:00Z|00066|binding|INFO|Claiming lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 for this additional chassis.
Nov 22 02:45:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:00Z|00067|binding|INFO|77e99205-9615-4ea6-ab25-d16bf8bb4804: Claiming fa:16:3e:ef:ca:bc 19.80.0.156
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.636 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.640 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:00 np0005531888 systemd-udevd[216253]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:45:00 np0005531888 systemd-machined[153106]: New machine qemu-8-instance-00000014.
Nov 22 02:45:00 np0005531888 NetworkManager[55166]: <info>  [1763797500.6863] device (tap4ab0012c-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:45:00 np0005531888 NetworkManager[55166]: <info>  [1763797500.6873] device (tap4ab0012c-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:45:00 np0005531888 systemd[1]: Started Virtual Machine qemu-8-instance-00000014.
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.723 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:00Z|00068|binding|INFO|Setting lport 4ab0012c-e73f-4cd6-b146-527583d046f3 ovn-installed in OVS
Nov 22 02:45:00 np0005531888 nova_compute[186788]: 2025-11-22 07:45:00.733 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.204 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.618 186792 DEBUG nova.network.neutron [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Successfully updated port: a6bc44ec-f3c6-4e42-8d58-733223efef87 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.632 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "refresh_cache-2160c105-2e0f-46fc-9039-28d7d834fc0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.632 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquired lock "refresh_cache-2160c105-2e0f-46fc-9039-28d7d834fc0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.632 186792 DEBUG nova.network.neutron [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.665 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797501.6649563, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.666 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.696 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.756 186792 DEBUG nova.compute.manager [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-changed-a6bc44ec-f3c6-4e42-8d58-733223efef87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.756 186792 DEBUG nova.compute.manager [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Refreshing instance network info cache due to event network-changed-a6bc44ec-f3c6-4e42-8d58-733223efef87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.756 186792 DEBUG oslo_concurrency.lockutils [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-2160c105-2e0f-46fc-9039-28d7d834fc0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:01 np0005531888 nova_compute[186788]: 2025-11-22 07:45:01.855 186792 DEBUG nova.network.neutron [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:02 np0005531888 nova_compute[186788]: 2025-11-22 07:45:02.536 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797502.5364351, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:02 np0005531888 nova_compute[186788]: 2025-11-22 07:45:02.537 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:02 np0005531888 nova_compute[186788]: 2025-11-22 07:45:02.563 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:02 np0005531888 nova_compute[186788]: 2025-11-22 07:45:02.569 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:02 np0005531888 nova_compute[186788]: 2025-11-22 07:45:02.590 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.454 186792 DEBUG nova.network.neutron [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Updating instance_info_cache with network_info: [{"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.472 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Releasing lock "refresh_cache-2160c105-2e0f-46fc-9039-28d7d834fc0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.473 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Instance network_info: |[{"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.473 186792 DEBUG oslo_concurrency.lockutils [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-2160c105-2e0f-46fc-9039-28d7d834fc0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.473 186792 DEBUG nova.network.neutron [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Refreshing network info cache for port a6bc44ec-f3c6-4e42-8d58-733223efef87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.476 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Start _get_guest_xml network_info=[{"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.481 186792 WARNING nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.485 186792 DEBUG nova.virt.libvirt.host [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.487 186792 DEBUG nova.virt.libvirt.host [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.492 186792 DEBUG nova.virt.libvirt.host [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.493 186792 DEBUG nova.virt.libvirt.host [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.494 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.494 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.495 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.495 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.495 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.495 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.496 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.496 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.496 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.496 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.496 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.496 186792 DEBUG nova.virt.hardware [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.500 186792 DEBUG nova.virt.libvirt.vif [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:44:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-509518702',display_name='tempest-ImagesNegativeTestJSON-server-509518702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-509518702',id=21,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11ecece124664a798ff1c73359735918',ramdisk_id='',reservation_id='r-5zt6n2bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1117007124',owner_user_name='tempest-ImagesNegativeTest
JSON-1117007124-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:58Z,user_data=None,user_id='7564d82375174ff8a84321bbd9d3ff32',uuid=2160c105-2e0f-46fc-9039-28d7d834fc0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.500 186792 DEBUG nova.network.os_vif_util [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Converting VIF {"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.501 186792 DEBUG nova.network.os_vif_util [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.502 186792 DEBUG nova.objects.instance [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2160c105-2e0f-46fc-9039-28d7d834fc0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.520 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <uuid>2160c105-2e0f-46fc-9039-28d7d834fc0c</uuid>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <name>instance-00000015</name>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:name>tempest-ImagesNegativeTestJSON-server-509518702</nova:name>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:45:03</nova:creationTime>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:user uuid="7564d82375174ff8a84321bbd9d3ff32">tempest-ImagesNegativeTestJSON-1117007124-project-member</nova:user>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:project uuid="11ecece124664a798ff1c73359735918">tempest-ImagesNegativeTestJSON-1117007124</nova:project>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        <nova:port uuid="a6bc44ec-f3c6-4e42-8d58-733223efef87">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <entry name="serial">2160c105-2e0f-46fc-9039-28d7d834fc0c</entry>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <entry name="uuid">2160c105-2e0f-46fc-9039-28d7d834fc0c</entry>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.config"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:0e:13:22"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <target dev="tapa6bc44ec-f3"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/console.log" append="off"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:45:03 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:45:03 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:45:03 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:45:03 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.521 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Preparing to wait for external event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.521 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.521 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.521 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.522 186792 DEBUG nova.virt.libvirt.vif [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:44:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-509518702',display_name='tempest-ImagesNegativeTestJSON-server-509518702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-509518702',id=21,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='11ecece124664a798ff1c73359735918',ramdisk_id='',reservation_id='r-5zt6n2bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1117007124',owner_user_name='tempest-ImagesNe
gativeTestJSON-1117007124-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:58Z,user_data=None,user_id='7564d82375174ff8a84321bbd9d3ff32',uuid=2160c105-2e0f-46fc-9039-28d7d834fc0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.522 186792 DEBUG nova.network.os_vif_util [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Converting VIF {"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.523 186792 DEBUG nova.network.os_vif_util [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.523 186792 DEBUG os_vif [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.523 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.524 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.524 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.528 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.528 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6bc44ec-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.530 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6bc44ec-f3, col_values=(('external_ids', {'iface-id': 'a6bc44ec-f3c6-4e42-8d58-733223efef87', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:13:22', 'vm-uuid': '2160c105-2e0f-46fc-9039-28d7d834fc0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.532 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:03 np0005531888 NetworkManager[55166]: <info>  [1763797503.5332] manager: (tapa6bc44ec-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.533 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.539 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.540 186792 INFO os_vif [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3')#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.597 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.597 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.597 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] No VIF found with MAC fa:16:3e:0e:13:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:45:03 np0005531888 nova_compute[186788]: 2025-11-22 07:45:03.598 186792 INFO nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Using config drive#033[00m
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00069|binding|INFO|Claiming lport 4ab0012c-e73f-4cd6-b146-527583d046f3 for this chassis.
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00070|binding|INFO|4ab0012c-e73f-4cd6-b146-527583d046f3: Claiming fa:16:3e:c2:22:c9 10.100.0.5
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00071|binding|INFO|Claiming lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 for this chassis.
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00072|binding|INFO|77e99205-9615-4ea6-ab25-d16bf8bb4804: Claiming fa:16:3e:ef:ca:bc 19.80.0.156
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00073|binding|INFO|Setting lport 4ab0012c-e73f-4cd6-b146-527583d046f3 up in Southbound
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00074|binding|INFO|Setting lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 up in Southbound
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.147 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ca:bc 19.80.0.156'], port_security=['fa:16:3e:ef:ca:bc 19.80.0.156'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['4ab0012c-e73f-4cd6-b146-527583d046f3'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1561653152', 'neutron:cidrs': '19.80.0.156/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2de8212b-d744-4bab-b451-7daef022c1bc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1561653152', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '4', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b9f16a9f-d373-4cb7-a13f-5e20d7a18db8, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=77e99205-9615-4ea6-ab25-d16bf8bb4804) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.149 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:22:c9 10.100.0.5'], port_security=['fa:16:3e:c2:22:c9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-971128270', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'feb5ca5f-df67-4f29-9c21-71ba30b5af9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-971128270', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '11', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=4ab0012c-e73f-4cd6-b146-527583d046f3) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.150 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 77e99205-9615-4ea6-ab25-d16bf8bb4804 in datapath 2de8212b-d744-4bab-b451-7daef022c1bc bound to our chassis#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.152 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2de8212b-d744-4bab-b451-7daef022c1bc#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.165 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bd884141-2b89-4ad2-9416-254d8a6da958]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.167 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2de8212b-d1 in ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.169 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2de8212b-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.169 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cd8b9b-0125-4fcf-a66e-2c6b92c3e4f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.170 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a24b950f-c0b9-4e40-a66f-abbdb153bef2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.183 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[03b980fb-140b-4ba2-a08c-e5aa153ac8a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.200 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2ae281ab-dd22-44f0-bc67-61f3473e13a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.230 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[332826fd-0812-4629-89c9-faca09560f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.236 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e508de6f-1914-47a1-a7ca-23968fc10624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 NetworkManager[55166]: <info>  [1763797504.2386] manager: (tap2de8212b-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Nov 22 02:45:04 np0005531888 systemd-udevd[216296]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.270 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8cec2b-210e-49ee-b54e-fa46a65c1d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.275 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[42269d99-9369-4a6a-baba-3473c5e202c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 NetworkManager[55166]: <info>  [1763797504.3063] device (tap2de8212b-d0): carrier: link connected
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.312 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[93b4797e-f1ad-472b-8e38-e0c07f8392f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.332 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b5330a-3426-466a-9997-b9012f9f94c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2de8212b-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:70:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424495, 'reachable_time': 35769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216317, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.353 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d06f1e14-2e55-49ab-a66d-0c8cd1f66419]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:70ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424495, 'tstamp': 424495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216318, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.374 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2f979c5c-3f1d-41eb-903a-22deb5f74955]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2de8212b-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:70:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424495, 'reachable_time': 35769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216319, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.388 186792 INFO nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Creating config drive at /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.config#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.396 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_0xmlbe7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.416 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4a554f60-3006-4d1d-a726-9f8517f8de31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.477 186792 INFO nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Post operation of migration started#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.483 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ba60f965-da84-4550-a1fb-144bdba813e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.485 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2de8212b-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.486 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.486 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2de8212b-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.488 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 NetworkManager[55166]: <info>  [1763797504.4889] manager: (tap2de8212b-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 22 02:45:04 np0005531888 kernel: tap2de8212b-d0: entered promiscuous mode
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.493 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.495 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2de8212b-d0, col_values=(('external_ids', {'iface-id': 'ebed6d9f-62b8-40d5-8d3c-93d6149e3602'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.496 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00075|binding|INFO|Releasing lport ebed6d9f-62b8-40d5-8d3c-93d6149e3602 from this chassis (sb_readonly=0)
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.512 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.517 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.517 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2de8212b-d744-4bab-b451-7daef022c1bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2de8212b-d744-4bab-b451-7daef022c1bc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.519 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1fda5a57-b554-486b-b18a-f61ec0e8d457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.519 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-2de8212b-d744-4bab-b451-7daef022c1bc
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/2de8212b-d744-4bab-b451-7daef022c1bc.pid.haproxy
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 2de8212b-d744-4bab-b451-7daef022c1bc
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.521 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'env', 'PROCESS_TAG=haproxy-2de8212b-d744-4bab-b451-7daef022c1bc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2de8212b-d744-4bab-b451-7daef022c1bc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.523 186792 DEBUG oslo_concurrency.processutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_0xmlbe7" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:04 np0005531888 NetworkManager[55166]: <info>  [1763797504.5818] manager: (tapa6bc44ec-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Nov 22 02:45:04 np0005531888 systemd-udevd[216311]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:45:04 np0005531888 kernel: tapa6bc44ec-f3: entered promiscuous mode
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00076|binding|INFO|Claiming lport a6bc44ec-f3c6-4e42-8d58-733223efef87 for this chassis.
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00077|binding|INFO|a6bc44ec-f3c6-4e42-8d58-733223efef87: Claiming fa:16:3e:0e:13:22 10.100.0.12
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.585 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 NetworkManager[55166]: <info>  [1763797504.5933] device (tapa6bc44ec-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:45:04 np0005531888 NetworkManager[55166]: <info>  [1763797504.5947] device (tapa6bc44ec-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:45:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:04.598 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:13:22 10.100.0.12'], port_security=['fa:16:3e:0e:13:22 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2160c105-2e0f-46fc-9039-28d7d834fc0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11ecece124664a798ff1c73359735918', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd34333b2-3a84-41f4-9cf7-6015655c2033', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=587792a0-45b0-4481-9df3-aa1d3d5dc7e8, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a6bc44ec-f3c6-4e42-8d58-733223efef87) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:04 np0005531888 systemd-machined[153106]: New machine qemu-9-instance-00000015.
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.641 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00078|binding|INFO|Setting lport a6bc44ec-f3c6-4e42-8d58-733223efef87 ovn-installed in OVS
Nov 22 02:45:04 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:04Z|00079|binding|INFO|Setting lport a6bc44ec-f3c6-4e42-8d58-733223efef87 up in Southbound
Nov 22 02:45:04 np0005531888 systemd[1]: Started Virtual Machine qemu-9-instance-00000015.
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:04 np0005531888 podman[216380]: 2025-11-22 07:45:04.920025964 +0000 UTC m=+0.054479183 container create 07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.957 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797504.9565406, 2160c105-2e0f-46fc-9039-28d7d834fc0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.957 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:04 np0005531888 systemd[1]: Started libpod-conmon-07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf.scope.
Nov 22 02:45:04 np0005531888 podman[216380]: 2025-11-22 07:45:04.892747881 +0000 UTC m=+0.027201130 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.992 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.995 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797504.956983, 2160c105-2e0f-46fc-9039-28d7d834fc0c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:04 np0005531888 nova_compute[186788]: 2025-11-22 07:45:04.995 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:45:04 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:45:05 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a80ed86de3f7329bdf6091e0448ef38f74d4866e345e3dee2aa840d794428d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.010 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.015 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:05 np0005531888 podman[216380]: 2025-11-22 07:45:05.019793199 +0000 UTC m=+0.154246438 container init 07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:45:05 np0005531888 podman[216380]: 2025-11-22 07:45:05.026384654 +0000 UTC m=+0.160837863 container start 07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.039 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:05 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [NOTICE]   (216405) : New worker (216407) forked
Nov 22 02:45:05 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [NOTICE]   (216405) : Loading success.
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.060 186792 DEBUG nova.compute.manager [req-c066492b-e771-4f36-af85-2d951eb88d16 req-eb392ad8-bfe0-4198-adb1-b86292c0490c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.060 186792 DEBUG oslo_concurrency.lockutils [req-c066492b-e771-4f36-af85-2d951eb88d16 req-eb392ad8-bfe0-4198-adb1-b86292c0490c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.061 186792 DEBUG oslo_concurrency.lockutils [req-c066492b-e771-4f36-af85-2d951eb88d16 req-eb392ad8-bfe0-4198-adb1-b86292c0490c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.061 186792 DEBUG oslo_concurrency.lockutils [req-c066492b-e771-4f36-af85-2d951eb88d16 req-eb392ad8-bfe0-4198-adb1-b86292c0490c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.061 186792 DEBUG nova.compute.manager [req-c066492b-e771-4f36-af85-2d951eb88d16 req-eb392ad8-bfe0-4198-adb1-b86292c0490c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Processing event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.062 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.065 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797505.065335, 2160c105-2e0f-46fc-9039-28d7d834fc0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.065 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.067 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.070 186792 INFO nova.virt.libvirt.driver [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Instance spawned successfully.#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.071 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.085 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.096 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.099 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.100 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.100 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.101 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.101 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.101 186792 DEBUG nova.virt.libvirt.driver [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.111 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 4ab0012c-e73f-4cd6-b146-527583d046f3 in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.116 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.128 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8da410de-2337-4638-8d42-ae8219719e28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.129 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd5fa4f6-01 in ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.131 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd5fa4f6-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.132 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a01de4-4d2f-468a-88fa-e7c74f9e5fc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.134 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9202cb-af9f-466f-9715-6edfc9b1607e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.144 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.145 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[8778256e-4ba2-444f-9281-f351f2c90a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.157 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[63f3fe44-283e-4230-b022-808f575be67a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.181 186792 INFO nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Took 6.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.182 186792 DEBUG nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.194 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3adbd9-d598-4119-b0fc-c8c9dedc447c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 NetworkManager[55166]: <info>  [1763797505.2030] manager: (tapcd5fa4f6-00): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.205 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6cef93-0dd9-484b-b39c-1b5489bb316e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.235 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9deb555d-452d-4981-b58d-72dd4db2ff75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.239 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb2e8fe-f0d9-46b7-9ac0-50357cfe6b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 NetworkManager[55166]: <info>  [1763797505.2643] device (tapcd5fa4f6-00): carrier: link connected
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.269 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ac58d0b3-e15d-4ccf-9f30-671c5bc3f222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.281 186792 INFO nova.compute.manager [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Took 7.53 seconds to build instance.#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.290 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[788a23f3-7366-4856-b429-42a7c805bcf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424590, 'reachable_time': 44559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216426, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.307 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7655f30c-c9c9-459e-9114-8c30d20e0bff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:db2b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424590, 'tstamp': 424590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216427, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.309 186792 DEBUG oslo_concurrency.lockutils [None req-69f88663-5d35-48e0-bb21-d36bca508457 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.315 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.315 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.316 186792 DEBUG nova.network.neutron [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.328 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b41cd2c1-c290-496d-adfa-9912c3ab2e65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 3, 'rx_bytes': 90, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 3, 'rx_bytes': 90, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424590, 'reachable_time': 44559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 224, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 224, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216428, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.366 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[59c601e4-a9b5-46e2-a2ed-138ed727de57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.418 186792 DEBUG nova.compute.manager [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.432 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[56d597f9-ce28-4fce-adb2-322c90b5d872]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.434 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.435 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.436 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd5fa4f6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:05 np0005531888 NetworkManager[55166]: <info>  [1763797505.4395] manager: (tapcd5fa4f6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 22 02:45:05 np0005531888 kernel: tapcd5fa4f6-00: entered promiscuous mode
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.443 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd5fa4f6-00, col_values=(('external_ids', {'iface-id': 'f400467f-3f35-4435-bb4a-0b3da05366fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:05 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:05Z|00080|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.447 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.454 186792 DEBUG nova.network.neutron [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Updated VIF entry in instance network info cache for port a6bc44ec-f3c6-4e42-8d58-733223efef87. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.455 186792 DEBUG nova.network.neutron [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Updating instance_info_cache with network_info: [{"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.456 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ea55c356-cb81-48b1-8fbe-58b966c44845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.458 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.458 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:05.460 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'env', 'PROCESS_TAG=haproxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.481 186792 DEBUG oslo_concurrency.lockutils [req-14d37301-54fd-4b02-99e4-fbe07abe567e req-10e9388b-8d9a-4dc7-b827-7f70a7aeb9d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-2160c105-2e0f-46fc-9039-28d7d834fc0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.527 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.527 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.544 186792 DEBUG nova.objects.instance [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_requests' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.558 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.558 186792 INFO nova.compute.claims [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.559 186792 DEBUG nova.objects.instance [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.568 186792 DEBUG nova.objects.instance [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_devices' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.611 186792 INFO nova.compute.resource_tracker [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating resource usage from migration 6daa597c-30c1-47c8-a172-26297bb49453#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.611 186792 DEBUG nova.compute.resource_tracker [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Starting to track incoming migration 6daa597c-30c1-47c8-a172-26297bb49453 with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.720 186792 DEBUG nova.compute.provider_tree [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.739 186792 DEBUG nova.scheduler.client.report [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.764 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:05 np0005531888 nova_compute[186788]: 2025-11-22 07:45:05.764 186792 INFO nova.compute.manager [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Migrating#033[00m
Nov 22 02:45:05 np0005531888 podman[216460]: 2025-11-22 07:45:05.845425937 +0000 UTC m=+0.050099624 container create 9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 02:45:05 np0005531888 systemd[1]: Started libpod-conmon-9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f.scope.
Nov 22 02:45:05 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:45:05 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/798cef787ddeabb8aba1f27f8bb756bd58927e41d54d9919a653d93c034148af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:45:05 np0005531888 podman[216460]: 2025-11-22 07:45:05.819009877 +0000 UTC m=+0.023683594 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:45:05 np0005531888 podman[216460]: 2025-11-22 07:45:05.930382811 +0000 UTC m=+0.135056528 container init 9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:45:05 np0005531888 podman[216460]: 2025-11-22 07:45:05.938196297 +0000 UTC m=+0.142869994 container start 9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:45:05 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [NOTICE]   (216479) : New worker (216481) forked
Nov 22 02:45:05 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [NOTICE]   (216479) : Loading success.
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.022 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a6bc44ec-f3c6-4e42-8d58-733223efef87 in datapath 57cb06ab-e72c-4c97-9546-674aecc5e6ef unbound from our chassis#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.026 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57cb06ab-e72c-4c97-9546-674aecc5e6ef#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.038 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aca57d0e-0877-4485-92bc-3ea7cc5c4ee9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.039 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap57cb06ab-e1 in ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.041 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap57cb06ab-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.041 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9c7f71-6fdd-437c-a6cb-6608ea225f82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.042 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a91c2cf4-2c2b-4e2e-a8ad-5721550bcac5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.054 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[90deb161-4b89-4f1f-b199-2a12bcc731d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.068 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec05562-a503-4050-a4a3-fa58913f0442]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.101 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdf26e2-f95b-469e-afd0-79c0c6f92886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 NetworkManager[55166]: <info>  [1763797506.1098] manager: (tap57cb06ab-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.111 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4bafcf54-6039-4bdc-8b5c-ea781e0c476b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.145 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5c92afed-c493-4b21-8c0a-4a9a6346ad05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.148 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7b181da0-21fe-4671-a894-b785236b8878]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 NetworkManager[55166]: <info>  [1763797506.1741] device (tap57cb06ab-e0): carrier: link connected
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.180 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d475b94f-36d8-465d-9930-232a9fba44ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.199 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[98700ed5-1119-46e0-9a9b-a29cd931ad47]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57cb06ab-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:be:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424681, 'reachable_time': 15570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216500, 'error': None, 'target': 'ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.218 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[feabf452-dc80-400d-bfaa-8c5d347f0056]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:bed4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424681, 'tstamp': 424681}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216501, 'error': None, 'target': 'ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.236 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eea3edfd-1482-4108-b35f-e13fe190cf9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57cb06ab-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:be:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424681, 'reachable_time': 15570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216502, 'error': None, 'target': 'ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.268 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ef56ebbb-6e29-4a25-bd55-b9965fe43c6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.334 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[889397ca-6799-4285-89f5-7bf2b6f2b152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.336 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57cb06ab-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.337 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.337 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57cb06ab-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.339 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531888 kernel: tap57cb06ab-e0: entered promiscuous mode
Nov 22 02:45:06 np0005531888 NetworkManager[55166]: <info>  [1763797506.3407] manager: (tap57cb06ab-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.342 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57cb06ab-e0, col_values=(('external_ids', {'iface-id': 'b376944e-8e66-4908-9103-25b506f68bc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:06 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:06Z|00081|binding|INFO|Releasing lport b376944e-8e66-4908-9103-25b506f68bc0 from this chassis (sb_readonly=0)
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.355 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.356 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/57cb06ab-e72c-4c97-9546-674aecc5e6ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/57cb06ab-e72c-4c97-9546-674aecc5e6ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.357 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6c868532-9e39-43eb-974f-db14ba331ef3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.358 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-57cb06ab-e72c-4c97-9546-674aecc5e6ef
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/57cb06ab-e72c-4c97-9546-674aecc5e6ef.pid.haproxy
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 57cb06ab-e72c-4c97-9546-674aecc5e6ef
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.359 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'env', 'PROCESS_TAG=haproxy-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/57cb06ab-e72c-4c97-9546-674aecc5e6ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:45:06 np0005531888 podman[216514]: 2025-11-22 07:45:06.696377669 +0000 UTC m=+0.069771436 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:45:06 np0005531888 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:45:06 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:45:06 np0005531888 systemd-logind[825]: New session 34 of user nova.
Nov 22 02:45:06 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:45:06 np0005531888 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:45:06 np0005531888 podman[216552]: 2025-11-22 07:45:06.786851331 +0000 UTC m=+0.064514614 container create 110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.802 186792 DEBUG nova.network.neutron [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating instance_info_cache with network_info: [{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.821 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:06 np0005531888 systemd[1]: Started libpod-conmon-110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98.scope.
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.846 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.846 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.846 186792 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:06 np0005531888 podman[216552]: 2025-11-22 07:45:06.753249311 +0000 UTC m=+0.030912614 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.858 186792 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 22 02:45:06 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:45:06 np0005531888 virtqemud[186358]: Domain id=8 name='instance-00000014' uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c is tainted: custom-monitor
Nov 22 02:45:06 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ab881ea33ef27d499a910dc0626a506bdb8a6046fd037593309ba13daf722c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.874 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.875 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.875 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.876 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.877 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:06 np0005531888 podman[216552]: 2025-11-22 07:45:06.883996151 +0000 UTC m=+0.161659464 container init 110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.886 186792 INFO nova.compute.manager [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Terminating instance#033[00m
Nov 22 02:45:06 np0005531888 podman[216552]: 2025-11-22 07:45:06.88997428 +0000 UTC m=+0.167637563 container start 110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.896 186792 DEBUG nova.compute.manager [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:45:06 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [NOTICE]   (216584) : New worker (216586) forked
Nov 22 02:45:06 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [NOTICE]   (216584) : Loading success.
Nov 22 02:45:06 np0005531888 kernel: tapa6bc44ec-f3 (unregistering): left promiscuous mode
Nov 22 02:45:06 np0005531888 NetworkManager[55166]: <info>  [1763797506.9262] device (tapa6bc44ec-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:45:06 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:06Z|00082|binding|INFO|Releasing lport a6bc44ec-f3c6-4e42-8d58-733223efef87 from this chassis (sb_readonly=0)
Nov 22 02:45:06 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:06Z|00083|binding|INFO|Setting lport a6bc44ec-f3c6-4e42-8d58-733223efef87 down in Southbound
Nov 22 02:45:06 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:06Z|00084|binding|INFO|Removing iface tapa6bc44ec-f3 ovn-installed in OVS
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.943 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531888 systemd[216564]: Queued start job for default target Main User Target.
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.948 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:13:22 10.100.0.12'], port_security=['fa:16:3e:0e:13:22 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2160c105-2e0f-46fc-9039-28d7d834fc0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '11ecece124664a798ff1c73359735918', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd34333b2-3a84-41f4-9cf7-6015655c2033', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=587792a0-45b0-4481-9df3-aa1d3d5dc7e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a6bc44ec-f3c6-4e42-8d58-733223efef87) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:06 np0005531888 nova_compute[186788]: 2025-11-22 07:45:06.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531888 systemd[216564]: Created slice User Application Slice.
Nov 22 02:45:06 np0005531888 systemd[216564]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:06 np0005531888 systemd[216564]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:06 np0005531888 systemd[216564]: Reached target Paths.
Nov 22 02:45:06 np0005531888 systemd[216564]: Reached target Timers.
Nov 22 02:45:06 np0005531888 systemd[216564]: Starting D-Bus User Message Bus Socket...
Nov 22 02:45:06 np0005531888 systemd[216564]: Starting Create User's Volatile Files and Directories...
Nov 22 02:45:06 np0005531888 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000015.scope: Deactivated successfully.
Nov 22 02:45:06 np0005531888 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000015.scope: Consumed 2.132s CPU time.
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.975 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a6bc44ec-f3c6-4e42-8d58-733223efef87 in datapath 57cb06ab-e72c-4c97-9546-674aecc5e6ef unbound from our chassis#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.976 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57cb06ab-e72c-4c97-9546-674aecc5e6ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.977 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8a215a-648a-4fc1-8940-d2d3e587a0d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:06.978 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef namespace which is not needed anymore#033[00m
Nov 22 02:45:06 np0005531888 systemd-machined[153106]: Machine qemu-9-instance-00000015 terminated.
Nov 22 02:45:06 np0005531888 systemd[216564]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:45:06 np0005531888 systemd[216564]: Reached target Sockets.
Nov 22 02:45:06 np0005531888 systemd[216564]: Finished Create User's Volatile Files and Directories.
Nov 22 02:45:06 np0005531888 systemd[216564]: Reached target Basic System.
Nov 22 02:45:06 np0005531888 systemd[216564]: Reached target Main User Target.
Nov 22 02:45:06 np0005531888 systemd[216564]: Startup finished in 186ms.
Nov 22 02:45:06 np0005531888 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:45:06 np0005531888 systemd[1]: Started Session 34 of User nova.
Nov 22 02:45:07 np0005531888 systemd[1]: session-34.scope: Deactivated successfully.
Nov 22 02:45:07 np0005531888 systemd-logind[825]: Session 34 logged out. Waiting for processes to exit.
Nov 22 02:45:07 np0005531888 systemd-logind[825]: Removed session 34.
Nov 22 02:45:07 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [NOTICE]   (216584) : haproxy version is 2.8.14-c23fe91
Nov 22 02:45:07 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [NOTICE]   (216584) : path to executable is /usr/sbin/haproxy
Nov 22 02:45:07 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [WARNING]  (216584) : Exiting Master process...
Nov 22 02:45:07 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [WARNING]  (216584) : Exiting Master process...
Nov 22 02:45:07 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [ALERT]    (216584) : Current worker (216586) exited with code 143 (Terminated)
Nov 22 02:45:07 np0005531888 neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef[216571]: [WARNING]  (216584) : All workers exited. Exiting... (0)
Nov 22 02:45:07 np0005531888 systemd[1]: libpod-110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98.scope: Deactivated successfully.
Nov 22 02:45:07 np0005531888 conmon[216571]: conmon 110a38fcf3d20fff1ebf <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98.scope/container/memory.events
Nov 22 02:45:07 np0005531888 podman[216619]: 2025-11-22 07:45:07.135012109 +0000 UTC m=+0.050932056 container died 110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.156 186792 DEBUG nova.compute.manager [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.157 186792 DEBUG oslo_concurrency.lockutils [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.157 186792 DEBUG oslo_concurrency.lockutils [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.157 186792 DEBUG oslo_concurrency.lockutils [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.157 186792 DEBUG nova.compute.manager [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] No waiting events found dispatching network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.157 186792 WARNING nova.compute.manager [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received unexpected event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.157 186792 DEBUG nova.compute.manager [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-vif-unplugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.158 186792 DEBUG oslo_concurrency.lockutils [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.158 186792 DEBUG oslo_concurrency.lockutils [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.158 186792 DEBUG oslo_concurrency.lockutils [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.158 186792 DEBUG nova.compute.manager [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] No waiting events found dispatching network-vif-unplugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.158 186792 DEBUG nova.compute.manager [req-c0f48d5a-d66a-4ae0-8143-9bd4c73741c5 req-b8b058f3-4902-4a5c-93b1-c617bc5890c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-vif-unplugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:45:07 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98-userdata-shm.mount: Deactivated successfully.
Nov 22 02:45:07 np0005531888 systemd[1]: var-lib-containers-storage-overlay-77ab881ea33ef27d499a910dc0626a506bdb8a6046fd037593309ba13daf722c-merged.mount: Deactivated successfully.
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.176 186792 INFO nova.virt.libvirt.driver [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Instance destroyed successfully.#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.177 186792 DEBUG nova.objects.instance [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lazy-loading 'resources' on Instance uuid 2160c105-2e0f-46fc-9039-28d7d834fc0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:07 np0005531888 podman[216619]: 2025-11-22 07:45:07.184741152 +0000 UTC m=+0.100661099 container cleanup 110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:45:07 np0005531888 systemd[1]: libpod-conmon-110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98.scope: Deactivated successfully.
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.198 186792 DEBUG nova.virt.libvirt.vif [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:44:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-509518702',display_name='tempest-ImagesNegativeTestJSON-server-509518702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-509518702',id=21,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:45:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='11ecece124664a798ff1c73359735918',ramdisk_id='',reservation_id='r-5zt6n2bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-1117007124',owner_user_name='tempest-ImagesNegativeTestJSON-1117007124-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:45:05Z,user_data=None,user_id='7564d82375174ff8a84321bbd9d3ff32',uuid=2160c105-2e0f-46fc-9039-28d7d834fc0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.198 186792 DEBUG nova.network.os_vif_util [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Converting VIF {"id": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "address": "fa:16:3e:0e:13:22", "network": {"id": "57cb06ab-e72c-4c97-9546-674aecc5e6ef", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-800625021-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "11ecece124664a798ff1c73359735918", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6bc44ec-f3", "ovs_interfaceid": "a6bc44ec-f3c6-4e42-8d58-733223efef87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.200 186792 DEBUG nova.network.os_vif_util [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.202 186792 DEBUG os_vif [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.204 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.204 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6bc44ec-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.208 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.212 186792 INFO os_vif [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:13:22,bridge_name='br-int',has_traffic_filtering=True,id=a6bc44ec-f3c6-4e42-8d58-733223efef87,network=Network(57cb06ab-e72c-4c97-9546-674aecc5e6ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6bc44ec-f3')#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.213 186792 INFO nova.virt.libvirt.driver [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Deleting instance files /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c_del#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.213 186792 INFO nova.virt.libvirt.driver [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Deletion of /var/lib/nova/instances/2160c105-2e0f-46fc-9039-28d7d834fc0c_del complete#033[00m
Nov 22 02:45:07 np0005531888 systemd-logind[825]: New session 36 of user nova.
Nov 22 02:45:07 np0005531888 systemd[1]: Started Session 36 of User nova.
Nov 22 02:45:07 np0005531888 podman[216667]: 2025-11-22 07:45:07.26424034 +0000 UTC m=+0.045548379 container remove 110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.270 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c22909-f866-43e2-8884-3cd95c031071]: (4, ('Sat Nov 22 07:45:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef (110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98)\n110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98\nSat Nov 22 07:45:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef (110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98)\n110a38fcf3d20fff1ebf2869e5438cbdfaa54939d7f80437de5fce81b2c8ca98\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.272 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc3dccb-9a5a-4012-8b14-2c6cbb7388d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.272 186792 INFO nova.compute.manager [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.273 186792 DEBUG oslo.service.loopingcall [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.273 186792 DEBUG nova.compute.manager [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.273 186792 DEBUG nova.network.neutron [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.273 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57cb06ab-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.275 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:07 np0005531888 kernel: tap57cb06ab-e0: left promiscuous mode
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.288 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.293 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4976926e-cc13-4654-ba1e-1006cc1ea3f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 systemd[1]: session-36.scope: Deactivated successfully.
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.308 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e42cf1e4-f351-4181-b588-88cb4510af44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 systemd-logind[825]: Session 36 logged out. Waiting for processes to exit.
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.310 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5c50fe-a462-4339-b4a8-32fefadb90de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 systemd-logind[825]: Removed session 36.
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.325 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb5bf38-41ca-49cd-acbe-96c01df489db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424673, 'reachable_time': 32545, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216683, 'error': None, 'target': 'ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.327 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-57cb06ab-e72c-4c97-9546-674aecc5e6ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:45:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:07.327 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae5b122-b5e4-49e0-9f09-15b1443c7606]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:07 np0005531888 systemd[1]: run-netns-ovnmeta\x2d57cb06ab\x2de72c\x2d4c97\x2d9546\x2d674aecc5e6ef.mount: Deactivated successfully.
Nov 22 02:45:07 np0005531888 nova_compute[186788]: 2025-11-22 07:45:07.869 186792 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.082 186792 DEBUG nova.network.neutron [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.095 186792 INFO nova.compute.manager [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Took 0.82 seconds to deallocate network for instance.#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.158 186792 DEBUG nova.compute.manager [req-c3fbbf43-a3f1-40b4-b0c4-5c93af9ca9c4 req-f88313f5-c6ef-404d-9491-208fffb6728b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-vif-deleted-a6bc44ec-f3c6-4e42-8d58-733223efef87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.161 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.161 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.273 186792 DEBUG nova.compute.provider_tree [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.288 186792 DEBUG nova.scheduler.client.report [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.309 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.345 186792 INFO nova.scheduler.client.report [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Deleted allocations for instance 2160c105-2e0f-46fc-9039-28d7d834fc0c#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.407 186792 DEBUG oslo_concurrency.lockutils [None req-4973cdb0-2774-4a4b-b8ea-761a0aebb4f5 7564d82375174ff8a84321bbd9d3ff32 11ecece124664a798ff1c73359735918 - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:08 np0005531888 podman[216688]: 2025-11-22 07:45:08.693641018 +0000 UTC m=+0.055266933 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.875 186792 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.882 186792 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:08 np0005531888 nova_compute[186788]: 2025-11-22 07:45:08.900 186792 DEBUG nova.objects.instance [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:45:09 np0005531888 nova_compute[186788]: 2025-11-22 07:45:09.243 186792 DEBUG nova.compute.manager [req-d7a0cdba-e118-498b-93e0-e0d975b26470 req-8fd815c6-6e41-4704-80ca-22ead8ef59d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:09 np0005531888 nova_compute[186788]: 2025-11-22 07:45:09.243 186792 DEBUG oslo_concurrency.lockutils [req-d7a0cdba-e118-498b-93e0-e0d975b26470 req-8fd815c6-6e41-4704-80ca-22ead8ef59d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:09 np0005531888 nova_compute[186788]: 2025-11-22 07:45:09.244 186792 DEBUG oslo_concurrency.lockutils [req-d7a0cdba-e118-498b-93e0-e0d975b26470 req-8fd815c6-6e41-4704-80ca-22ead8ef59d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:09 np0005531888 nova_compute[186788]: 2025-11-22 07:45:09.244 186792 DEBUG oslo_concurrency.lockutils [req-d7a0cdba-e118-498b-93e0-e0d975b26470 req-8fd815c6-6e41-4704-80ca-22ead8ef59d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2160c105-2e0f-46fc-9039-28d7d834fc0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:09 np0005531888 nova_compute[186788]: 2025-11-22 07:45:09.244 186792 DEBUG nova.compute.manager [req-d7a0cdba-e118-498b-93e0-e0d975b26470 req-8fd815c6-6e41-4704-80ca-22ead8ef59d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] No waiting events found dispatching network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:09 np0005531888 nova_compute[186788]: 2025-11-22 07:45:09.244 186792 WARNING nova.compute.manager [req-d7a0cdba-e118-498b-93e0-e0d975b26470 req-8fd815c6-6e41-4704-80ca-22ead8ef59d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Received unexpected event network-vif-plugged-a6bc44ec-f3c6-4e42-8d58-733223efef87 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.192 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "0319a6cd-b217-43df-aaf2-8c9f6688a151" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.192 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "0319a6cd-b217-43df-aaf2-8c9f6688a151" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.210 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.289 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.290 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.297 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.297 186792 INFO nova.compute.claims [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.489 186792 DEBUG nova.compute.provider_tree [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.503 186792 DEBUG nova.scheduler.client.report [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.523 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.525 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.579 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.579 186792 DEBUG nova.network.neutron [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.603 186792 INFO nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.646 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.765 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.766 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.766 186792 INFO nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Creating image(s)#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.767 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "/var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.767 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "/var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.768 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "/var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.779 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.839 186792 DEBUG nova.network.neutron [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.840 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.841 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.842 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.842 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.852 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.872 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.873 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.874 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.874 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.874 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.884 186792 INFO nova.compute.manager [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Terminating instance#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.890 186792 DEBUG nova.compute.manager [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:45:10 np0005531888 kernel: tap4ab0012c-e7 (unregistering): left promiscuous mode
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.915 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.916 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:10 np0005531888 NetworkManager[55166]: <info>  [1763797510.9167] device (tap4ab0012c-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00085|binding|INFO|Releasing lport 4ab0012c-e73f-4cd6-b146-527583d046f3 from this chassis (sb_readonly=0)
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00086|binding|INFO|Setting lport 4ab0012c-e73f-4cd6-b146-527583d046f3 down in Southbound
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00087|binding|INFO|Releasing lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 from this chassis (sb_readonly=0)
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00088|binding|INFO|Setting lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 down in Southbound
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00089|binding|INFO|Removing iface tap4ab0012c-e7 ovn-installed in OVS
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.939 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00090|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 02:45:10 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:10Z|00091|binding|INFO|Releasing lport ebed6d9f-62b8-40d5-8d3c-93d6149e3602 from this chassis (sb_readonly=0)
Nov 22 02:45:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:10.944 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ca:bc 19.80.0.156'], port_security=['fa:16:3e:ef:ca:bc 19.80.0.156'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['4ab0012c-e73f-4cd6-b146-527583d046f3'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1561653152', 'neutron:cidrs': '19.80.0.156/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2de8212b-d744-4bab-b451-7daef022c1bc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1561653152', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '5', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b9f16a9f-d373-4cb7-a13f-5e20d7a18db8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=77e99205-9615-4ea6-ab25-d16bf8bb4804) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:10.946 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:22:c9 10.100.0.5'], port_security=['fa:16:3e:c2:22:c9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-971128270', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'feb5ca5f-df67-4f29-9c21-71ba30b5af9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-971128270', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '13', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=4ab0012c-e73f-4cd6-b146-527583d046f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:10.948 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 77e99205-9615-4ea6-ab25-d16bf8bb4804 in datapath 2de8212b-d744-4bab-b451-7daef022c1bc unbound from our chassis#033[00m
Nov 22 02:45:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:10.949 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2de8212b-d744-4bab-b451-7daef022c1bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:45:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:10.950 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e682b55d-dbe5-456c-8f00-ea111e14a350]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:10.951 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc namespace which is not needed anymore#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.962 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.963 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.964 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:10 np0005531888 nova_compute[186788]: 2025-11-22 07:45:10.985 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.029 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.030 186792 DEBUG nova.virt.disk.api [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Checking if we can resize image /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.030 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.070 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000014.scope: Deactivated successfully.
Nov 22 02:45:11 np0005531888 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000014.scope: Consumed 1.884s CPU time.
Nov 22 02:45:11 np0005531888 systemd-machined[153106]: Machine qemu-8-instance-00000014 terminated.
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [NOTICE]   (216405) : haproxy version is 2.8.14-c23fe91
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [NOTICE]   (216405) : path to executable is /usr/sbin/haproxy
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [WARNING]  (216405) : Exiting Master process...
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [WARNING]  (216405) : Exiting Master process...
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [ALERT]    (216405) : Current worker (216407) exited with code 143 (Terminated)
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216401]: [WARNING]  (216405) : All workers exited. Exiting... (0)
Nov 22 02:45:11 np0005531888 systemd[1]: libpod-07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf.scope: Deactivated successfully.
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.104 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.105 186792 DEBUG nova.virt.disk.api [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Cannot resize image /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.105 186792 DEBUG nova.objects.instance [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lazy-loading 'migration_context' on Instance uuid 0319a6cd-b217-43df-aaf2-8c9f6688a151 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:11 np0005531888 podman[216749]: 2025-11-22 07:45:11.105995888 +0000 UTC m=+0.060964866 container died 07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.125 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.125 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Ensure instance console log exists: /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.126 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.126 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.126 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.128 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.136 186792 WARNING nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:11 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf-userdata-shm.mount: Deactivated successfully.
Nov 22 02:45:11 np0005531888 systemd[1]: var-lib-containers-storage-overlay-8a80ed86de3f7329bdf6091e0448ef38f74d4866e345e3dee2aa840d794428d4-merged.mount: Deactivated successfully.
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.153 186792 DEBUG nova.virt.libvirt.host [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.154 186792 DEBUG nova.virt.libvirt.host [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:11 np0005531888 podman[216749]: 2025-11-22 07:45:11.158019869 +0000 UTC m=+0.112988857 container cleanup 07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:45:11 np0005531888 systemd[1]: libpod-conmon-07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf.scope: Deactivated successfully.
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.167 186792 DEBUG nova.virt.libvirt.host [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.168 186792 DEBUG nova.virt.libvirt.host [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.169 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.169 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.170 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.170 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.170 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.170 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.170 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.170 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.171 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.171 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.171 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.171 186792 DEBUG nova.virt.hardware [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.175 186792 DEBUG nova.objects.instance [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0319a6cd-b217-43df-aaf2-8c9f6688a151 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.180 186792 INFO nova.virt.libvirt.driver [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Instance destroyed successfully.#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.180 186792 DEBUG nova.objects.instance [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lazy-loading 'resources' on Instance uuid feb5ca5f-df67-4f29-9c21-71ba30b5af9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.199 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <uuid>0319a6cd-b217-43df-aaf2-8c9f6688a151</uuid>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <name>instance-00000017</name>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-1161735959</nova:name>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:45:11</nova:creationTime>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:user uuid="9dc3d104549745fdaace1dd5280da2f2">tempest-ServersAdminNegativeTestJSON-1347039048-project-member</nova:user>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:        <nova:project uuid="8778f7e37a30439da41d0a6b383be684">tempest-ServersAdminNegativeTestJSON-1347039048</nova:project>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <entry name="serial">0319a6cd-b217-43df-aaf2-8c9f6688a151</entry>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <entry name="uuid">0319a6cd-b217-43df-aaf2-8c9f6688a151</entry>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.config"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/console.log" append="off"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:45:11 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:45:11 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:45:11 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:45:11 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.201 186792 DEBUG nova.virt.libvirt.vif [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1661145969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1661145969',id=20,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-lpdner4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:45:08Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.202 186792 DEBUG nova.network.os_vif_util [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converting VIF {"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.202 186792 DEBUG nova.network.os_vif_util [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.202 186792 DEBUG os_vif [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.204 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.204 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ab0012c-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.205 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.207 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.208 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.209 186792 INFO os_vif [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7')#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.210 186792 INFO nova.virt.libvirt.driver [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Deleting instance files /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c_del#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.211 186792 INFO nova.virt.libvirt.driver [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Deletion of /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c_del complete#033[00m
Nov 22 02:45:11 np0005531888 podman[216796]: 2025-11-22 07:45:11.225881107 +0000 UTC m=+0.045744225 container remove 07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.232 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[627699c6-bdfc-44cf-8c2e-547d2f31b4bd]: (4, ('Sat Nov 22 07:45:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc (07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf)\n07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf\nSat Nov 22 07:45:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc (07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf)\n07da27780ec73803f1dd26ded35593427e5e8f32a02a95a2bb23647595ec5daf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.234 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be786e56-758b-4a5a-a53d-ae1f577689eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.235 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2de8212b-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.236 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 kernel: tap2de8212b-d0: left promiscuous mode
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.248 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.250 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[77eeaa05-5f6f-4054-8a53-f20837093380]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.270 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab7d50c-77be-40e0-9704-dd4a0855cf59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.271 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b60ef3-d071-476f-b0fa-057be23c3229]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.282 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.282 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.283 186792 INFO nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Using config drive#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.291 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fdec8c9b-3151-4a9b-ace9-6c9e229bfd55]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424486, 'reachable_time': 17468, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216810, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.294 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.294 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9cf79d-16da-43c4-8f79-fd8af92ab5e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.294 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 4ab0012c-e73f-4cd6-b146-527583d046f3 in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis#033[00m
Nov 22 02:45:11 np0005531888 systemd[1]: run-netns-ovnmeta\x2d2de8212b\x2dd744\x2d4bab\x2db451\x2d7daef022c1bc.mount: Deactivated successfully.
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.295 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.296 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[04e73af4-edd4-4690-9af5-11b24f3c3f46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.297 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace which is not needed anymore#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.327 186792 INFO nova.compute.manager [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.328 186792 DEBUG oslo.service.loopingcall [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.328 186792 DEBUG nova.compute.manager [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.329 186792 DEBUG nova.network.neutron [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [NOTICE]   (216479) : haproxy version is 2.8.14-c23fe91
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [NOTICE]   (216479) : path to executable is /usr/sbin/haproxy
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [WARNING]  (216479) : Exiting Master process...
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [ALERT]    (216479) : Current worker (216481) exited with code 143 (Terminated)
Nov 22 02:45:11 np0005531888 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216475]: [WARNING]  (216479) : All workers exited. Exiting... (0)
Nov 22 02:45:11 np0005531888 systemd[1]: libpod-9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f.scope: Deactivated successfully.
Nov 22 02:45:11 np0005531888 podman[216828]: 2025-11-22 07:45:11.433303984 +0000 UTC m=+0.045286764 container died 9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:45:11 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f-userdata-shm.mount: Deactivated successfully.
Nov 22 02:45:11 np0005531888 systemd[1]: var-lib-containers-storage-overlay-798cef787ddeabb8aba1f27f8bb756bd58927e41d54d9919a653d93c034148af-merged.mount: Deactivated successfully.
Nov 22 02:45:11 np0005531888 podman[216828]: 2025-11-22 07:45:11.468893054 +0000 UTC m=+0.080875854 container cleanup 9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:45:11 np0005531888 systemd[1]: libpod-conmon-9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f.scope: Deactivated successfully.
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.543 186792 INFO nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Creating config drive at /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.config#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.548 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgbeo12kt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:11 np0005531888 podman[216855]: 2025-11-22 07:45:11.548860704 +0000 UTC m=+0.053261233 container remove 9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.554 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[70623adc-13d9-4a36-b188-6b76d1116aed]: (4, ('Sat Nov 22 07:45:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f)\n9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f\nSat Nov 22 07:45:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f)\n9a9e02d4d762ee8ebdb31d4c6feb124239484c0977da60509c28339f9b54d75f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.558 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b72a00-5e3f-42c3-9a00-d18b8e461a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.559 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:11 np0005531888 kernel: tapcd5fa4f6-00: left promiscuous mode
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.572 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.576 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba7b931-4f67-4c5e-b256-3b6509fef0f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.598 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fa52a1-d9ab-4539-ab1f-202f319b1ba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.599 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[df117e3d-9d82-4b5d-a871-73c5d921c359]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.616 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f45e8fb1-d325-4cef-9f68-a9dcc9c42dad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424583, 'reachable_time': 32210, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216873, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.619 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:45:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:11.619 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a9e059-ec56-454d-9a51-a862e4ef7bfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.649 186792 DEBUG nova.compute.manager [req-3ca11330-fe3b-4694-aa51-4c5286edaa5b req-d7e35f9b-070e-4531-ade1-74b57fe0fbd4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.649 186792 DEBUG oslo_concurrency.lockutils [req-3ca11330-fe3b-4694-aa51-4c5286edaa5b req-d7e35f9b-070e-4531-ade1-74b57fe0fbd4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.650 186792 DEBUG oslo_concurrency.lockutils [req-3ca11330-fe3b-4694-aa51-4c5286edaa5b req-d7e35f9b-070e-4531-ade1-74b57fe0fbd4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.650 186792 DEBUG oslo_concurrency.lockutils [req-3ca11330-fe3b-4694-aa51-4c5286edaa5b req-d7e35f9b-070e-4531-ade1-74b57fe0fbd4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.650 186792 DEBUG nova.compute.manager [req-3ca11330-fe3b-4694-aa51-4c5286edaa5b req-d7e35f9b-070e-4531-ade1-74b57fe0fbd4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.650 186792 DEBUG nova.compute.manager [req-3ca11330-fe3b-4694-aa51-4c5286edaa5b req-d7e35f9b-070e-4531-ade1-74b57fe0fbd4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:45:11 np0005531888 nova_compute[186788]: 2025-11-22 07:45:11.680 186792 DEBUG oslo_concurrency.processutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgbeo12kt" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:11 np0005531888 systemd-machined[153106]: New machine qemu-10-instance-00000017.
Nov 22 02:45:11 np0005531888 systemd[1]: Started Virtual Machine qemu-10-instance-00000017.
Nov 22 02:45:11 np0005531888 podman[216885]: 2025-11-22 07:45:11.844900808 +0000 UTC m=+0.073670874 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm)
Nov 22 02:45:12 np0005531888 systemd[1]: run-netns-ovnmeta\x2dcd5fa4f6\x2d0f1b\x2d41f2\x2d9643\x2d3c1a36620dc9.mount: Deactivated successfully.
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.914 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797512.9135027, 0319a6cd-b217-43df-aaf2-8c9f6688a151 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.915 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.918 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.918 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.923 186792 INFO nova.virt.libvirt.driver [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Instance spawned successfully.#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.924 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.944 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.953 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.958 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.958 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.958 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.959 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.959 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.960 186792 DEBUG nova.virt.libvirt.driver [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.985 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.985 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797512.9148748, 0319a6cd-b217-43df-aaf2-8c9f6688a151 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:12 np0005531888 nova_compute[186788]: 2025-11-22 07:45:12.985 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.014 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.018 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.038 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.041 186792 INFO nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Took 2.28 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.042 186792 DEBUG nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.132 186792 INFO nova.compute.manager [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Took 2.87 seconds to build instance.#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.150 186792 DEBUG oslo_concurrency.lockutils [None req-a9e33a60-c81c-40b4-9b93-694da321afcc 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "0319a6cd-b217-43df-aaf2-8c9f6688a151" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.842 186792 DEBUG nova.compute.manager [req-826c2a49-f9aa-46a8-b6d3-7c15db7e946c req-13097824-8191-4e0e-9cdb-fed5dc1f2469 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.843 186792 DEBUG oslo_concurrency.lockutils [req-826c2a49-f9aa-46a8-b6d3-7c15db7e946c req-13097824-8191-4e0e-9cdb-fed5dc1f2469 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.843 186792 DEBUG oslo_concurrency.lockutils [req-826c2a49-f9aa-46a8-b6d3-7c15db7e946c req-13097824-8191-4e0e-9cdb-fed5dc1f2469 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.843 186792 DEBUG oslo_concurrency.lockutils [req-826c2a49-f9aa-46a8-b6d3-7c15db7e946c req-13097824-8191-4e0e-9cdb-fed5dc1f2469 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.843 186792 DEBUG nova.compute.manager [req-826c2a49-f9aa-46a8-b6d3-7c15db7e946c req-13097824-8191-4e0e-9cdb-fed5dc1f2469 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:13 np0005531888 nova_compute[186788]: 2025-11-22 07:45:13.844 186792 WARNING nova.compute.manager [req-826c2a49-f9aa-46a8-b6d3-7c15db7e946c req-13097824-8191-4e0e-9cdb-fed5dc1f2469 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.181 186792 DEBUG nova.network.neutron [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.199 186792 INFO nova.compute.manager [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Took 3.87 seconds to deallocate network for instance.#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.262 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.262 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.269 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.299 186792 INFO nova.scheduler.client.report [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Deleted allocations for instance feb5ca5f-df67-4f29-9c21-71ba30b5af9c#033[00m
Nov 22 02:45:15 np0005531888 nova_compute[186788]: 2025-11-22 07:45:15.387 186792 DEBUG oslo_concurrency.lockutils [None req-e219dd9f-23e9-4f22-aadb-a180e931bac1 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:16 np0005531888 nova_compute[186788]: 2025-11-22 07:45:16.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:16 np0005531888 nova_compute[186788]: 2025-11-22 07:45:16.209 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:17 np0005531888 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:45:17 np0005531888 systemd[216564]: Activating special unit Exit the Session...
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped target Main User Target.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped target Basic System.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped target Paths.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped target Sockets.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped target Timers.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:17 np0005531888 systemd[216564]: Closed D-Bus User Message Bus Socket.
Nov 22 02:45:17 np0005531888 systemd[216564]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:45:17 np0005531888 systemd[216564]: Removed slice User Application Slice.
Nov 22 02:45:17 np0005531888 systemd[216564]: Reached target Shutdown.
Nov 22 02:45:17 np0005531888 systemd[216564]: Finished Exit the Session.
Nov 22 02:45:17 np0005531888 systemd[216564]: Reached target Exit the Session.
Nov 22 02:45:17 np0005531888 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:45:17 np0005531888 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:45:17 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:45:17 np0005531888 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:45:17 np0005531888 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:45:17 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:45:17 np0005531888 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:45:20 np0005531888 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:45:20 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:45:20 np0005531888 systemd-logind[825]: New session 37 of user nova.
Nov 22 02:45:20 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:45:20 np0005531888 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:45:20 np0005531888 systemd[216924]: Queued start job for default target Main User Target.
Nov 22 02:45:20 np0005531888 systemd[216924]: Created slice User Application Slice.
Nov 22 02:45:20 np0005531888 systemd[216924]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:20 np0005531888 systemd[216924]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:20 np0005531888 systemd[216924]: Reached target Paths.
Nov 22 02:45:20 np0005531888 systemd[216924]: Reached target Timers.
Nov 22 02:45:20 np0005531888 systemd[216924]: Starting D-Bus User Message Bus Socket...
Nov 22 02:45:20 np0005531888 systemd[216924]: Starting Create User's Volatile Files and Directories...
Nov 22 02:45:20 np0005531888 systemd[216924]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:45:20 np0005531888 systemd[216924]: Finished Create User's Volatile Files and Directories.
Nov 22 02:45:20 np0005531888 systemd[216924]: Reached target Sockets.
Nov 22 02:45:20 np0005531888 systemd[216924]: Reached target Basic System.
Nov 22 02:45:20 np0005531888 systemd[216924]: Reached target Main User Target.
Nov 22 02:45:20 np0005531888 systemd[216924]: Startup finished in 150ms.
Nov 22 02:45:20 np0005531888 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:45:20 np0005531888 systemd[1]: Started Session 37 of User nova.
Nov 22 02:45:21 np0005531888 nova_compute[186788]: 2025-11-22 07:45:21.210 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:21 np0005531888 nova_compute[186788]: 2025-11-22 07:45:21.214 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:21 np0005531888 nova_compute[186788]: 2025-11-22 07:45:21.242 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 22 02:45:21 np0005531888 nova_compute[186788]: 2025-11-22 07:45:21.242 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:45:21 np0005531888 nova_compute[186788]: 2025-11-22 07:45:21.244 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:21 np0005531888 nova_compute[186788]: 2025-11-22 07:45:21.245 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:45:21 np0005531888 systemd[1]: session-37.scope: Deactivated successfully.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: Session 37 logged out. Waiting for processes to exit.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: Removed session 37.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: New session 39 of user nova.
Nov 22 02:45:21 np0005531888 systemd[1]: Started Session 39 of User nova.
Nov 22 02:45:21 np0005531888 systemd[1]: session-39.scope: Deactivated successfully.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: Session 39 logged out. Waiting for processes to exit.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: Removed session 39.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: New session 40 of user nova.
Nov 22 02:45:21 np0005531888 systemd[1]: Started Session 40 of User nova.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: Session 40 logged out. Waiting for processes to exit.
Nov 22 02:45:21 np0005531888 systemd[1]: session-40.scope: Deactivated successfully.
Nov 22 02:45:21 np0005531888 systemd-logind[825]: Removed session 40.
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.174 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797507.1726124, 2160c105-2e0f-46fc-9039-28d7d834fc0c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.175 186792 INFO nova.compute.manager [-] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.194 186792 DEBUG nova.compute.manager [None req-07ac88bf-febb-4989-a4a9-70a52360fef9 - - - - - -] [instance: 2160c105-2e0f-46fc-9039-28d7d834fc0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.329 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.331 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.332 186792 DEBUG nova.network.neutron [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:22 np0005531888 nova_compute[186788]: 2025-11-22 07:45:22.603 186792 DEBUG nova.network.neutron [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.259 186792 DEBUG nova.network.neutron [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.292 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.484 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.487 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.488 186792 INFO nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Creating image(s)#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.489 186792 DEBUG nova.objects.instance [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.505 186792 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.567 186792 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.569 186792 DEBUG nova.virt.disk.api [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Checking if we can resize image /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.569 186792 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.629 186792 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.630 186792 DEBUG nova.virt.disk.api [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Cannot resize image /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:45:23 np0005531888 podman[216955]: 2025-11-22 07:45:23.718223725 +0000 UTC m=+0.078165296 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:45:23 np0005531888 podman[216956]: 2025-11-22 07:45:23.746303297 +0000 UTC m=+0.106578276 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.804 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.805 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Ensure instance console log exists: /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.806 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.807 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.807 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.812 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.819 186792 WARNING nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.829 186792 DEBUG nova.virt.libvirt.host [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.831 186792 DEBUG nova.virt.libvirt.host [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.835 186792 DEBUG nova.virt.libvirt.host [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.836 186792 DEBUG nova.virt.libvirt.host [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.838 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.838 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.839 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.839 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.840 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.840 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.840 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.841 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.841 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.841 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.842 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.842 186792 DEBUG nova.virt.hardware [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.843 186792 DEBUG nova.objects.instance [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.861 186792 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.929 186792 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.931 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.931 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.932 186792 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.935 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <uuid>88c868e5-67c5-4f22-b584-d8772316044d</uuid>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <name>instance-00000016</name>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <memory>196608</memory>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:name>tempest-MigrationsAdminTest-server-1406881377</nova:name>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:45:23</nova:creationTime>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.micro">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:memory>192</nova:memory>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <entry name="serial">88c868e5-67c5-4f22-b584-d8772316044d</entry>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <entry name="uuid">88c868e5-67c5-4f22-b584-d8772316044d</entry>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/console.log" append="off"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:45:23 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:45:23 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:45:23 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:45:23 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.998 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.999 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:23 np0005531888 nova_compute[186788]: 2025-11-22 07:45:23.999 186792 INFO nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Using config drive#033[00m
Nov 22 02:45:24 np0005531888 systemd-machined[153106]: New machine qemu-11-instance-00000016.
Nov 22 02:45:24 np0005531888 systemd[1]: Started Virtual Machine qemu-11-instance-00000016.
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.359 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797524.3585446, 88c868e5-67c5-4f22-b584-d8772316044d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.359 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.363 186792 DEBUG nova.compute.manager [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.368 186792 INFO nova.virt.libvirt.driver [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance running successfully.#033[00m
Nov 22 02:45:24 np0005531888 virtqemud[186358]: argument unsupported: QEMU guest agent is not configured
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.372 186792 DEBUG nova.virt.libvirt.guest [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.373 186792 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.377 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.382 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.464 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.465 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797524.3607998, 88c868e5-67c5-4f22-b584-d8772316044d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.465 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.499 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:24 np0005531888 nova_compute[186788]: 2025-11-22 07:45:24.504 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.177 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797511.1739779, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.179 186792 INFO nova.compute.manager [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.207 186792 DEBUG nova.compute.manager [None req-2553cb90-222a-4489-ab4e-6cbbc4fd0702 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.245 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.248 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.280 186792 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.281 186792 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.281 186792 DEBUG nova.network.neutron [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.522 186792 DEBUG nova.network.neutron [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.853 186792 DEBUG nova.network.neutron [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.884 186792 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:26 np0005531888 nova_compute[186788]: 2025-11-22 07:45:26.906 186792 DEBUG nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Creating tmpfile /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/tmpp6zk3x_o to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Nov 22 02:45:26 np0005531888 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Deactivated successfully.
Nov 22 02:45:26 np0005531888 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Consumed 2.877s CPU time.
Nov 22 02:45:27 np0005531888 systemd-machined[153106]: Machine qemu-11-instance-00000016 terminated.
Nov 22 02:45:27 np0005531888 podman[217045]: 2025-11-22 07:45:27.088583034 +0000 UTC m=+0.071579181 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.165 186792 INFO nova.virt.libvirt.driver [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance destroyed successfully.#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.165 186792 DEBUG nova.objects.instance [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.180 186792 INFO nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Deleting instance files /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_del#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.186 186792 INFO nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Deletion of /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_del complete#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.280 186792 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.280 186792 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.298 186792 DEBUG nova.objects.instance [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.409 186792 DEBUG nova.compute.provider_tree [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.425 186792 DEBUG nova.scheduler.client.report [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:27 np0005531888 nova_compute[186788]: 2025-11-22 07:45:27.482 186792 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.868 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "0319a6cd-b217-43df-aaf2-8c9f6688a151" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.868 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "0319a6cd-b217-43df-aaf2-8c9f6688a151" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.869 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "0319a6cd-b217-43df-aaf2-8c9f6688a151-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.869 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "0319a6cd-b217-43df-aaf2-8c9f6688a151-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.869 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "0319a6cd-b217-43df-aaf2-8c9f6688a151-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.876 186792 INFO nova.compute.manager [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Terminating instance#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.884 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "refresh_cache-0319a6cd-b217-43df-aaf2-8c9f6688a151" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.885 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquired lock "refresh_cache-0319a6cd-b217-43df-aaf2-8c9f6688a151" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:29 np0005531888 nova_compute[186788]: 2025-11-22 07:45:29.885 186792 DEBUG nova.network.neutron [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.179 186792 DEBUG nova.network.neutron [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.517 186792 DEBUG nova.network.neutron [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.535 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Releasing lock "refresh_cache-0319a6cd-b217-43df-aaf2-8c9f6688a151" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.536 186792 DEBUG nova.compute.manager [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:45:30 np0005531888 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 22 02:45:30 np0005531888 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000017.scope: Consumed 14.953s CPU time.
Nov 22 02:45:30 np0005531888 systemd-machined[153106]: Machine qemu-10-instance-00000017 terminated.
Nov 22 02:45:30 np0005531888 podman[217079]: 2025-11-22 07:45:30.668534085 +0000 UTC m=+0.067673484 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.783 186792 INFO nova.virt.libvirt.driver [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Instance destroyed successfully.#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.783 186792 DEBUG nova.objects.instance [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lazy-loading 'resources' on Instance uuid 0319a6cd-b217-43df-aaf2-8c9f6688a151 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.797 186792 INFO nova.virt.libvirt.driver [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Deleting instance files /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151_del#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.797 186792 INFO nova.virt.libvirt.driver [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Deletion of /var/lib/nova/instances/0319a6cd-b217-43df-aaf2-8c9f6688a151_del complete#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.982 186792 INFO nova.compute.manager [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.983 186792 DEBUG oslo.service.loopingcall [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.983 186792 DEBUG nova.compute.manager [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:45:30 np0005531888 nova_compute[186788]: 2025-11-22 07:45:30.983 186792 DEBUG nova.network.neutron [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.247 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.249 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.288 186792 DEBUG nova.network.neutron [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.301 186792 DEBUG nova.network.neutron [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.321 186792 INFO nova.compute.manager [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Took 0.34 seconds to deallocate network for instance.#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.615 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.615 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.713 186792 DEBUG nova.compute.provider_tree [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.733 186792 DEBUG nova.scheduler.client.report [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.762 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.813 186792 INFO nova.scheduler.client.report [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Deleted allocations for instance 0319a6cd-b217-43df-aaf2-8c9f6688a151#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.912 186792 DEBUG oslo_concurrency.lockutils [None req-ac8f1f72-3402-423a-8f0c-f3ef6a4fe4f8 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "0319a6cd-b217-43df-aaf2-8c9f6688a151" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:45:31 np0005531888 nova_compute[186788]: 2025-11-22 07:45:31.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:45:32 np0005531888 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:45:32 np0005531888 systemd[216924]: Activating special unit Exit the Session...
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped target Main User Target.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped target Basic System.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped target Paths.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped target Sockets.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped target Timers.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:32 np0005531888 systemd[216924]: Closed D-Bus User Message Bus Socket.
Nov 22 02:45:32 np0005531888 systemd[216924]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:45:32 np0005531888 systemd[216924]: Removed slice User Application Slice.
Nov 22 02:45:32 np0005531888 systemd[216924]: Reached target Shutdown.
Nov 22 02:45:32 np0005531888 systemd[216924]: Finished Exit the Session.
Nov 22 02:45:32 np0005531888 systemd[216924]: Reached target Exit the Session.
Nov 22 02:45:32 np0005531888 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:45:32 np0005531888 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.124 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.126 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.127 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.127 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:32 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:45:32 np0005531888 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:45:32 np0005531888 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:45:32 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:45:32 np0005531888 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.342 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.598 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.627 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:32 np0005531888 nova_compute[186788]: 2025-11-22 07:45:32.628 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.250 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "0edda70f-511a-49a0-8c13-561c699336c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.250 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.275 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.364 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.364 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.372 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.373 186792 INFO nova.compute.claims [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.552 186792 DEBUG nova.compute.provider_tree [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.567 186792 DEBUG nova.scheduler.client.report [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.599 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.600 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.667 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.667 186792 DEBUG nova.network.neutron [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.685 186792 INFO nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.704 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.829 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.831 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.831 186792 INFO nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Creating image(s)#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.832 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.833 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.833 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.848 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.909 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.910 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.911 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.926 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.984 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:34 np0005531888 nova_compute[186788]: 2025-11-22 07:45:34.985 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.022 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.023 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.024 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.086 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.088 186792 DEBUG nova.virt.disk.api [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Checking if we can resize image /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.088 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.156 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.157 186792 DEBUG nova.virt.disk.api [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Cannot resize image /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.158 186792 DEBUG nova.objects.instance [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.169 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.170 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Ensure instance console log exists: /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.171 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.171 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.171 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.286 186792 DEBUG nova.network.neutron [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.286 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.288 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.292 186792 WARNING nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.296 186792 DEBUG nova.virt.libvirt.host [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.297 186792 DEBUG nova.virt.libvirt.host [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.300 186792 DEBUG nova.virt.libvirt.host [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.301 186792 DEBUG nova.virt.libvirt.host [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.302 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.302 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.303 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.303 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.303 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.303 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.303 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.304 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.304 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.304 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.304 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.304 186792 DEBUG nova.virt.hardware [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.308 186792 DEBUG nova.objects.instance [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_devices' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.325 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <uuid>0edda70f-511a-49a0-8c13-561c699336c1</uuid>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <name>instance-0000001a</name>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:name>tempest-MigrationsAdminTest-server-1141640296</nova:name>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:45:35</nova:creationTime>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <entry name="serial">0edda70f-511a-49a0-8c13-561c699336c1</entry>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <entry name="uuid">0edda70f-511a-49a0-8c13-561c699336c1</entry>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/console.log" append="off"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:45:35 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:45:35 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:45:35 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:45:35 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.378 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.378 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.379 186792 INFO nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Using config drive#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.687 186792 INFO nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Creating config drive at /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.692 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg1mp8ryx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:35.818 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.819 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:35.822 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:45:35 np0005531888 nova_compute[186788]: 2025-11-22 07:45:35.822 186792 DEBUG oslo_concurrency.processutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg1mp8ryx" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:35.823 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:35 np0005531888 systemd-machined[153106]: New machine qemu-12-instance-0000001a.
Nov 22 02:45:35 np0005531888 systemd[1]: Started Virtual Machine qemu-12-instance-0000001a.
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.200 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797536.1992471, 0edda70f-511a-49a0-8c13-561c699336c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.202 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.206 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.206 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.211 186792 INFO nova.virt.libvirt.driver [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance spawned successfully.#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.212 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.237 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.246 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.253 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.254 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.254 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.255 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.256 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.256 186792 DEBUG nova.virt.libvirt.driver [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.260 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.281 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.282 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797536.2011151, 0edda70f-511a-49a0-8c13-561c699336c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.282 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.315 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.320 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.352 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.356 186792 INFO nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Took 1.53 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.357 186792 DEBUG nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.456 186792 INFO nova.compute.manager [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Took 2.12 seconds to build instance.#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.485 186792 DEBUG oslo_concurrency.lockutils [None req-f8b85289-96b2-468d-a927-151e061b881a 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.620 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:36.797 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:36.798 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:45:36.798 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:36 np0005531888 nova_compute[186788]: 2025-11-22 07:45:36.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:37 np0005531888 podman[217153]: 2025-11-22 07:45:37.702743322 +0000 UTC m=+0.071348245 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:45:37 np0005531888 nova_compute[186788]: 2025-11-22 07:45:37.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:39 np0005531888 podman[217173]: 2025-11-22 07:45:39.696021328 +0000 UTC m=+0.061036809 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:45:39 np0005531888 nova_compute[186788]: 2025-11-22 07:45:39.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.748 186792 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.748 186792 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquired lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.748 186792 DEBUG nova.network.neutron [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.926 186792 DEBUG nova.network.neutron [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:40 np0005531888 nova_compute[186788]: 2025-11-22 07:45:40.976 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.055 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.130 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.131 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.191 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.197 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.262 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.264 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.264 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.264 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.265 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.266 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.286 186792 DEBUG nova.network.neutron [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.313 186792 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Releasing lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.328 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.430 186792 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.430 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Creating file /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/9d33c5f8b6ee4c1b9c5a80675eacc1af.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.431 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/9d33c5f8b6ee4c1b9c5a80675eacc1af.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.535 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.537 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5467MB free_disk=73.42966079711914GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.537 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.538 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.608 186792 INFO nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating resource usage from migration 3de1f32a-6ea3-48fe-b75c-a192ccefb94a#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.669 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 3cf2b323-ba35-4807-8337-288f6c983860 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.669 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration 3de1f32a-6ea3-48fe-b75c-a192ccefb94a is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.670 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.670 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.747 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.761 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.795 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.795 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.869 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/9d33c5f8b6ee4c1b9c5a80675eacc1af.tmp" returned: 1 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.870 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/9d33c5f8b6ee4c1b9c5a80675eacc1af.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.870 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Creating directory /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 02:45:41 np0005531888 nova_compute[186788]: 2025-11-22 07:45:41.871 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:42 np0005531888 nova_compute[186788]: 2025-11-22 07:45:42.068 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:42 np0005531888 nova_compute[186788]: 2025-11-22 07:45:42.072 186792 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:45:42 np0005531888 nova_compute[186788]: 2025-11-22 07:45:42.163 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797527.1624115, 88c868e5-67c5-4f22-b584-d8772316044d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:42 np0005531888 nova_compute[186788]: 2025-11-22 07:45:42.164 186792 INFO nova.compute.manager [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:45:42 np0005531888 nova_compute[186788]: 2025-11-22 07:45:42.186 186792 DEBUG nova.compute.manager [None req-91471fce-d937-4970-9177-5aa4461b070e - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:42 np0005531888 podman[217209]: 2025-11-22 07:45:42.691702218 +0000 UTC m=+0.064860976 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Nov 22 02:45:43 np0005531888 nova_compute[186788]: 2025-11-22 07:45:43.796 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:43 np0005531888 nova_compute[186788]: 2025-11-22 07:45:43.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:45 np0005531888 nova_compute[186788]: 2025-11-22 07:45:45.781 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797530.780206, 0319a6cd-b217-43df-aaf2-8c9f6688a151 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:45 np0005531888 nova_compute[186788]: 2025-11-22 07:45:45.782 186792 INFO nova.compute.manager [-] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:45:45 np0005531888 nova_compute[186788]: 2025-11-22 07:45:45.800 186792 DEBUG nova.compute.manager [None req-bec96d82-ee35-45f0-b2b2-b1f67f00bafb - - - - - -] [instance: 0319a6cd-b217-43df-aaf2-8c9f6688a151] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:46 np0005531888 nova_compute[186788]: 2025-11-22 07:45:46.288 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:46 np0005531888 nova_compute[186788]: 2025-11-22 07:45:46.291 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:51 np0005531888 nova_compute[186788]: 2025-11-22 07:45:51.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:51 np0005531888 nova_compute[186788]: 2025-11-22 07:45:51.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:52 np0005531888 nova_compute[186788]: 2025-11-22 07:45:52.128 186792 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:45:54 np0005531888 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 22 02:45:54 np0005531888 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001a.scope: Consumed 13.882s CPU time.
Nov 22 02:45:54 np0005531888 systemd-machined[153106]: Machine qemu-12-instance-0000001a terminated.
Nov 22 02:45:54 np0005531888 podman[217245]: 2025-11-22 07:45:54.405263537 +0000 UTC m=+0.068118679 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:45:54 np0005531888 podman[217246]: 2025-11-22 07:45:54.46962857 +0000 UTC m=+0.128466190 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.149 186792 INFO nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.156 186792 INFO nova.virt.libvirt.driver [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance destroyed successfully.#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.160 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.226 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.227 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.286 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.289 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Copying file /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk to 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:45:55 np0005531888 nova_compute[186788]: 2025-11-22 07:45:55.289 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.085 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "scp -r /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk" returned: 0 in 0.796s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.086 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Copying file /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.087 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk.config 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.294 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.309 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "scp -C -r /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk.config 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.310 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Copying file /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.311 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk.info 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.524 186792 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "scp -C -r /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_resize/disk.info 192.168.122.101:/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.704 186792 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "0edda70f-511a-49a0-8c13-561c699336c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.705 186792 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:56 np0005531888 nova_compute[186788]: 2025-11-22 07:45:56.705 186792 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:57 np0005531888 ovn_controller[95067]: 2025-11-22T07:45:57Z|00092|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 22 02:45:57 np0005531888 podman[217326]: 2025-11-22 07:45:57.689468444 +0000 UTC m=+0.061473880 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:46:01 np0005531888 nova_compute[186788]: 2025-11-22 07:46:01.295 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:01 np0005531888 podman[217350]: 2025-11-22 07:46:01.671619358 +0000 UTC m=+0.045661790 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 02:46:03 np0005531888 nova_compute[186788]: 2025-11-22 07:46:03.906 186792 INFO nova.compute.manager [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Swapping old allocation on dict_keys(['1afd6948-7df7-46e7-8718-35e2b3007a5d']) held by migration 3de1f32a-6ea3-48fe-b75c-a192ccefb94a for instance#033[00m
Nov 22 02:46:03 np0005531888 nova_compute[186788]: 2025-11-22 07:46:03.939 186792 DEBUG nova.scheduler.client.report [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Overwriting current allocation {'allocations': {'9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 20}}, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'consumer_generation': 1} on consumer 0edda70f-511a-49a0-8c13-561c699336c1 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.189 186792 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.190 186792 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.190 186792 DEBUG nova.network.neutron [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.445 186792 DEBUG nova.network.neutron [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.745 186792 DEBUG nova.network.neutron [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.759 186792 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.760 186792 DEBUG nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.770 186792 DEBUG nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.775 186792 WARNING nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.782 186792 DEBUG nova.virt.libvirt.host [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.783 186792 DEBUG nova.virt.libvirt.host [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.788 186792 DEBUG nova.virt.libvirt.host [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.788 186792 DEBUG nova.virt.libvirt.host [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.790 186792 DEBUG nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.790 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.790 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.790 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.791 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.791 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.791 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.791 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.791 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.792 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.792 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.792 186792 DEBUG nova.virt.hardware [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.792 186792 DEBUG nova.objects.instance [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.810 186792 DEBUG oslo_concurrency.processutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.870 186792 DEBUG oslo_concurrency.processutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.872 186792 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.872 186792 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.873 186792 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:04 np0005531888 nova_compute[186788]: 2025-11-22 07:46:04.876 186792 DEBUG nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <uuid>0edda70f-511a-49a0-8c13-561c699336c1</uuid>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <name>instance-0000001a</name>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:name>tempest-MigrationsAdminTest-server-1141640296</nova:name>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:46:04</nova:creationTime>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <entry name="serial">0edda70f-511a-49a0-8c13-561c699336c1</entry>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <entry name="uuid">0edda70f-511a-49a0-8c13-561c699336c1</entry>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/console.log" append="off"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:46:04 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:46:04 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:46:04 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:46:04 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:46:04 np0005531888 systemd-machined[153106]: New machine qemu-13-instance-0000001a.
Nov 22 02:46:05 np0005531888 systemd[1]: Started Virtual Machine qemu-13-instance-0000001a.
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.429 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for 0edda70f-511a-49a0-8c13-561c699336c1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.431 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797565.428144, 0edda70f-511a-49a0-8c13-561c699336c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.431 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.434 186792 DEBUG nova.compute.manager [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.440 186792 INFO nova.virt.libvirt.driver [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance running successfully.#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.440 186792 DEBUG nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.476 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.479 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.522 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.523 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797565.4292102, 0edda70f-511a-49a0-8c13-561c699336c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.523 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Started (Lifecycle Event)#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.554 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.558 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.590 186792 INFO nova.compute.manager [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance to original state: 'active'#033[00m
Nov 22 02:46:05 np0005531888 nova_compute[186788]: 2025-11-22 07:46:05.594 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 22 02:46:06 np0005531888 nova_compute[186788]: 2025-11-22 07:46:06.300 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:06 np0005531888 nova_compute[186788]: 2025-11-22 07:46:06.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:06 np0005531888 nova_compute[186788]: 2025-11-22 07:46:06.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 22 02:46:06 np0005531888 nova_compute[186788]: 2025-11-22 07:46:06.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:46:06 np0005531888 nova_compute[186788]: 2025-11-22 07:46:06.305 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:46:06 np0005531888 nova_compute[186788]: 2025-11-22 07:46:06.306 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.488 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "0edda70f-511a-49a0-8c13-561c699336c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.489 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.489 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "0edda70f-511a-49a0-8c13-561c699336c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.490 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.490 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.498 186792 INFO nova.compute.manager [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Terminating instance#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.505 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.505 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.505 186792 DEBUG nova.network.neutron [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:07 np0005531888 nova_compute[186788]: 2025-11-22 07:46:07.865 186792 DEBUG nova.network.neutron [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.158 186792 DEBUG nova.network.neutron [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.182 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.184 186792 DEBUG nova.compute.manager [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:46:08 np0005531888 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 22 02:46:08 np0005531888 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Consumed 3.236s CPU time.
Nov 22 02:46:08 np0005531888 systemd-machined[153106]: Machine qemu-13-instance-0000001a terminated.
Nov 22 02:46:08 np0005531888 podman[217398]: 2025-11-22 07:46:08.338053341 +0000 UTC m=+0.067181535 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.440 186792 INFO nova.virt.libvirt.driver [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance destroyed successfully.#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.441 186792 DEBUG nova.objects.instance [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.456 186792 INFO nova.virt.libvirt.driver [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Deleting instance files /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_del#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.463 186792 INFO nova.virt.libvirt.driver [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Deletion of /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_del complete#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.528 186792 INFO nova.compute.manager [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.529 186792 DEBUG oslo.service.loopingcall [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.529 186792 DEBUG nova.compute.manager [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.529 186792 DEBUG nova.network.neutron [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.734 186792 DEBUG nova.network.neutron [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.750 186792 DEBUG nova.network.neutron [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.764 186792 INFO nova.compute.manager [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Took 0.23 seconds to deallocate network for instance.#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.819 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.820 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.886 186792 DEBUG nova.compute.provider_tree [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.900 186792 DEBUG nova.scheduler.client.report [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.916 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:08 np0005531888 nova_compute[186788]: 2025-11-22 07:46:08.940 186792 INFO nova.scheduler.client.report [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Deleted allocations for instance 0edda70f-511a-49a0-8c13-561c699336c1#033[00m
Nov 22 02:46:09 np0005531888 nova_compute[186788]: 2025-11-22 07:46:09.019 186792 DEBUG oslo_concurrency.lockutils [None req-d10f6935-ada4-4d02-9d76-bdc076994ac0 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "0edda70f-511a-49a0-8c13-561c699336c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:10 np0005531888 podman[217427]: 2025-11-22 07:46:10.686105762 +0000 UTC m=+0.056585537 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:46:11 np0005531888 nova_compute[186788]: 2025-11-22 07:46:11.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:13 np0005531888 podman[217451]: 2025-11-22 07:46:13.690852852 +0000 UTC m=+0.065057672 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.759 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "3cf2b323-ba35-4807-8337-288f6c983860" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.760 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.760 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "3cf2b323-ba35-4807-8337-288f6c983860-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.760 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.761 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.769 186792 INFO nova.compute.manager [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Terminating instance#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.775 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.775 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:15 np0005531888 nova_compute[186788]: 2025-11-22 07:46:15.775 186792 DEBUG nova.network.neutron [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:16 np0005531888 nova_compute[186788]: 2025-11-22 07:46:16.026 186792 DEBUG nova.network.neutron [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:16 np0005531888 nova_compute[186788]: 2025-11-22 07:46:16.262 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating tmpfile /var/lib/nova/instances/tmp8ztptjqj to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 22 02:46:16 np0005531888 nova_compute[186788]: 2025-11-22 07:46:16.263 186792 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 22 02:46:16 np0005531888 nova_compute[186788]: 2025-11-22 07:46:16.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.329 186792 DEBUG nova.network.neutron [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.347 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.348 186792 DEBUG nova.compute.manager [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:46:17 np0005531888 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000013.scope: Deactivated successfully.
Nov 22 02:46:17 np0005531888 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000013.scope: Consumed 16.869s CPU time.
Nov 22 02:46:17 np0005531888 systemd-machined[153106]: Machine qemu-7-instance-00000013 terminated.
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.606 186792 INFO nova.virt.libvirt.driver [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance destroyed successfully.#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.606 186792 DEBUG nova.objects.instance [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.618 186792 INFO nova.virt.libvirt.driver [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Deleting instance files /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_del#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.624 186792 INFO nova.virt.libvirt.driver [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Deletion of /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_del complete#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.703 186792 INFO nova.compute.manager [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.703 186792 DEBUG oslo.service.loopingcall [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.704 186792 DEBUG nova.compute.manager [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.704 186792 DEBUG nova.network.neutron [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.908 186792 DEBUG nova.network.neutron [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.922 186792 DEBUG nova.network.neutron [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:17 np0005531888 nova_compute[186788]: 2025-11-22 07:46:17.937 186792 INFO nova.compute.manager [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Took 0.23 seconds to deallocate network for instance.#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.011 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.011 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.096 186792 DEBUG nova.compute.provider_tree [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.114 186792 DEBUG nova.scheduler.client.report [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.159 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.194 186792 INFO nova.scheduler.client.report [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Deleted allocations for instance 3cf2b323-ba35-4807-8337-288f6c983860#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.241 186792 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.274 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.274 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.274 186792 DEBUG nova.network.neutron [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:18 np0005531888 nova_compute[186788]: 2025-11-22 07:46:18.298 186792 DEBUG oslo_concurrency.lockutils [None req-f30da220-1345-42dc-8a25-7b76b74c8d0d 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.218 186792 DEBUG nova.network.neutron [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.259 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.268 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.268 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating instance directory: /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.269 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating disk.info with the contents: {'/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk': 'qcow2', '/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.269 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.270 186792 DEBUG nova.objects.instance [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.299 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.362 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.364 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.365 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.381 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.446 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.447 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.486 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.487 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.487 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.542 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.543 186792 DEBUG nova.virt.disk.api [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Checking if we can resize image /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.544 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.605 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.607 186792 DEBUG nova.virt.disk.api [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Cannot resize image /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.607 186792 DEBUG nova.objects.instance [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'migration_context' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.630 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.656 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config 485376" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.658 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config to /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:46:21 np0005531888 nova_compute[186788]: 2025-11-22 07:46:21.658 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.142 186792 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.143 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.144 186792 DEBUG nova.virt.libvirt.vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:11Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.145 186792 DEBUG nova.network.os_vif_util [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.146 186792 DEBUG nova.network.os_vif_util [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.146 186792 DEBUG os_vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.147 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.147 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.149 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.153 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.153 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap973cdfc2-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.154 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap973cdfc2-4a, col_values=(('external_ids', {'iface-id': '973cdfc2-4ad8-4f41-b383-4b64b1b5433f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:9b:98', 'vm-uuid': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.156 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531888 NetworkManager[55166]: <info>  [1763797582.1576] manager: (tap973cdfc2-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.159 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.166 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.168 186792 INFO os_vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a')#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.168 186792 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 22 02:46:22 np0005531888 nova_compute[186788]: 2025-11-22 07:46:22.169 186792 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 22 02:46:23 np0005531888 nova_compute[186788]: 2025-11-22 07:46:23.438 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797568.4365895, 0edda70f-511a-49a0-8c13-561c699336c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:23 np0005531888 nova_compute[186788]: 2025-11-22 07:46:23.439 186792 INFO nova.compute.manager [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:46:23 np0005531888 nova_compute[186788]: 2025-11-22 07:46:23.457 186792 DEBUG nova.compute.manager [None req-9dd4e98d-a4bc-4d19-b0fb-d25fa2d7b60f - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:24 np0005531888 podman[217503]: 2025-11-22 07:46:24.702081156 +0000 UTC m=+0.073175957 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 02:46:24 np0005531888 podman[217504]: 2025-11-22 07:46:24.726883006 +0000 UTC m=+0.095425382 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:46:24 np0005531888 nova_compute[186788]: 2025-11-22 07:46:24.960 186792 DEBUG nova.network.neutron [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 22 02:46:24 np0005531888 nova_compute[186788]: 2025-11-22 07:46:24.969 186792 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 22 02:46:25 np0005531888 kernel: tap973cdfc2-4a: entered promiscuous mode
Nov 22 02:46:25 np0005531888 NetworkManager[55166]: <info>  [1763797585.3770] manager: (tap973cdfc2-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.381 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:25Z|00093|binding|INFO|Claiming lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f for this additional chassis.
Nov 22 02:46:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:25Z|00094|binding|INFO|973cdfc2-4ad8-4f41-b383-4b64b1b5433f: Claiming fa:16:3e:02:9b:98 10.100.0.14
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.391 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:25 np0005531888 systemd-udevd[217565]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:46:25 np0005531888 systemd-machined[153106]: New machine qemu-14-instance-0000001c.
Nov 22 02:46:25 np0005531888 NetworkManager[55166]: <info>  [1763797585.4336] device (tap973cdfc2-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:46:25 np0005531888 NetworkManager[55166]: <info>  [1763797585.4345] device (tap973cdfc2-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.447 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:25Z|00095|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f ovn-installed in OVS
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.452 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:25 np0005531888 systemd[1]: Started Virtual Machine qemu-14-instance-0000001c.
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.821 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.822 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.847 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.946 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.946 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.955 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:46:25 np0005531888 nova_compute[186788]: 2025-11-22 07:46:25.957 186792 INFO nova.compute.claims [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.138 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797586.1382985, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.139 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Started (Lifecycle Event)#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.212 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.219 186792 DEBUG nova.compute.provider_tree [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.314 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.421 186792 DEBUG nova.scheduler.client.report [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.759 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.762 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.899 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.900 186792 DEBUG nova.network.neutron [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:46:26 np0005531888 nova_compute[186788]: 2025-11-22 07:46:26.955 186792 INFO nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.081 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.103 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797587.1034973, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.104 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.128 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.132 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.147 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.192 186792 DEBUG nova.policy [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.361 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.363 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.364 186792 INFO nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Creating image(s)#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.364 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "/var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.365 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "/var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.365 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "/var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.380 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.445 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.447 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.448 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.460 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.518 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.519 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.563 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.565 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.566 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.634 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.635 186792 DEBUG nova.virt.disk.api [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Checking if we can resize image /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.636 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.702 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.703 186792 DEBUG nova.virt.disk.api [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Cannot resize image /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.703 186792 DEBUG nova.objects.instance [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'migration_context' on Instance uuid eabc0ddf-fdc1-473a-b224-0dc0954d754c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.720 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.721 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Ensure instance console log exists: /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.721 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.722 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:27 np0005531888 nova_compute[186788]: 2025-11-22 07:46:27.722 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:28 np0005531888 podman[217611]: 2025-11-22 07:46:28.705737515 +0000 UTC m=+0.067505335 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:46:29 np0005531888 nova_compute[186788]: 2025-11-22 07:46:29.173 186792 DEBUG nova.network.neutron [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Successfully created port: 0bb26125-f048-4751-9c50-c8b2ff511438 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:46:29 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:29Z|00096|binding|INFO|Claiming lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f for this chassis.
Nov 22 02:46:29 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:29Z|00097|binding|INFO|973cdfc2-4ad8-4f41-b383-4b64b1b5433f: Claiming fa:16:3e:02:9b:98 10.100.0.14
Nov 22 02:46:29 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:29Z|00098|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f up in Southbound
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.866 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:9b:98 10.100.0.14'], port_security=['fa:16:3e:02:9b:98 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=973cdfc2-4ad8-4f41-b383-4b64b1b5433f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.868 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 bound to our chassis#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.870 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.886 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7474b7e8-ac2a-4eda-8cbb-c3a14d702309]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.887 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc3f966e1-81 in ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.890 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc3f966e1-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.890 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb6ad07-49a4-4d8d-af29-90b5bb48139e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.892 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[290b1701-2ed0-47ac-ae7c-1f48016e520f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.909 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[548761ac-5cbc-41f5-9385-e0948a2109e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.926 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f38b6303-0639-49ab-b6cd-04c9adf7c0a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.961 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[232a121f-a2b2-48a3-9376-41d1511cb6ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:29.967 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e5641b-dc44-46e9-8f8b-4f73c5d91709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:29 np0005531888 NetworkManager[55166]: <info>  [1763797589.9683] manager: (tapc3f966e1-80): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Nov 22 02:46:29 np0005531888 systemd-udevd[217642]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.007 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d2de330f-d340-4709-be47-4d0a22bcac83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.012 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[0c73701b-bac4-426b-9e6b-0c3cdfe0a1c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 NetworkManager[55166]: <info>  [1763797590.0414] device (tapc3f966e1-80): carrier: link connected
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.047 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[04e6626d-adca-499e-802c-ab25efcd46c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.067 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[92e57f51-8eee-43d5-b01e-1542561e5055]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433068, 'reachable_time': 26368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217661, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.088 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5468ea4d-1ba6-45e1-a1ec-3c480a9872db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:7499'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433068, 'tstamp': 433068}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217662, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.107 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eee4ac84-5772-481d-94df-b4374c7a74d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433068, 'reachable_time': 26368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217663, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 nova_compute[186788]: 2025-11-22 07:46:30.134 186792 INFO nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Post operation of migration started#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.140 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d293c3a0-57c7-400c-8d07-6d45686bfb71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.198 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ff77bc42-ca9d-4e4a-8eba-8c00ad5a818c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.200 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.200 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.200 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f966e1-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:30 np0005531888 nova_compute[186788]: 2025-11-22 07:46:30.202 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:30 np0005531888 kernel: tapc3f966e1-80: entered promiscuous mode
Nov 22 02:46:30 np0005531888 NetworkManager[55166]: <info>  [1763797590.2036] manager: (tapc3f966e1-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.205 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3f966e1-80, col_values=(('external_ids', {'iface-id': '8206cb6d-dd78-493d-a276-fccb0eeecc7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:30 np0005531888 nova_compute[186788]: 2025-11-22 07:46:30.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:30Z|00099|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 02:46:30 np0005531888 nova_compute[186788]: 2025-11-22 07:46:30.218 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.218 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:46:30 np0005531888 nova_compute[186788]: 2025-11-22 07:46:30.219 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.219 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ddaab3-5620-4366-88e2-01074640f73d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.220 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:46:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:30.221 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'env', 'PROCESS_TAG=haproxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:46:30 np0005531888 podman[217696]: 2025-11-22 07:46:30.615669162 +0000 UTC m=+0.062290621 container create b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:46:30 np0005531888 systemd[1]: Started libpod-conmon-b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f.scope.
Nov 22 02:46:30 np0005531888 podman[217696]: 2025-11-22 07:46:30.580590453 +0000 UTC m=+0.027211932 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:46:30 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:46:30 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6bc217b20e5d14d9af3d247ee842f76c6b550d0efc8d26dedae306493d5fd7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:46:30 np0005531888 podman[217696]: 2025-11-22 07:46:30.714586611 +0000 UTC m=+0.161208090 container init b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:46:30 np0005531888 podman[217696]: 2025-11-22 07:46:30.724177014 +0000 UTC m=+0.170798473 container start b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:46:30 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [NOTICE]   (217716) : New worker (217718) forked
Nov 22 02:46:30 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [NOTICE]   (217716) : Loading success.
Nov 22 02:46:31 np0005531888 nova_compute[186788]: 2025-11-22 07:46:31.066 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:31 np0005531888 nova_compute[186788]: 2025-11-22 07:46:31.066 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:31 np0005531888 nova_compute[186788]: 2025-11-22 07:46:31.067 186792 DEBUG nova.network.neutron [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:31 np0005531888 nova_compute[186788]: 2025-11-22 07:46:31.316 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.111 186792 DEBUG nova.network.neutron [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Successfully updated port: 0bb26125-f048-4751-9c50-c8b2ff511438 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.123 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "refresh_cache-eabc0ddf-fdc1-473a-b224-0dc0954d754c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.124 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquired lock "refresh_cache-eabc0ddf-fdc1-473a-b224-0dc0954d754c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.124 186792 DEBUG nova.network.neutron [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.162 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.478 186792 DEBUG nova.network.neutron [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.566 186792 DEBUG nova.network.neutron [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.593 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.606 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797577.6034667, 3cf2b323-ba35-4807-8337-288f6c983860 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.606 186792 INFO nova.compute.manager [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.636 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.636 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.637 186792 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.639 186792 DEBUG nova.compute.manager [None req-cad6bcb5-8a25-4c33-ab81-439b05c07170 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:32 np0005531888 nova_compute[186788]: 2025-11-22 07:46:32.644 186792 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 22 02:46:32 np0005531888 virtqemud[186358]: Domain id=14 name='instance-0000001c' uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 is tainted: custom-monitor
Nov 22 02:46:32 np0005531888 podman[217727]: 2025-11-22 07:46:32.689817627 +0000 UTC m=+0.062002804 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 22 02:46:33 np0005531888 nova_compute[186788]: 2025-11-22 07:46:33.656 186792 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 22 02:46:33 np0005531888 nova_compute[186788]: 2025-11-22 07:46:33.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:33 np0005531888 nova_compute[186788]: 2025-11-22 07:46:33.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:46:33 np0005531888 nova_compute[186788]: 2025-11-22 07:46:33.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:46:33 np0005531888 nova_compute[186788]: 2025-11-22 07:46:33.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 02:46:33 np0005531888 nova_compute[186788]: 2025-11-22 07:46:33.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.141 186792 DEBUG nova.network.neutron [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Updating instance_info_cache with network_info: [{"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.202 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Releasing lock "refresh_cache-eabc0ddf-fdc1-473a-b224-0dc0954d754c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.203 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Instance network_info: |[{"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.205 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Start _get_guest_xml network_info=[{"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.212 186792 WARNING nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.216 186792 DEBUG nova.compute.manager [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received event network-changed-0bb26125-f048-4751-9c50-c8b2ff511438 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.216 186792 DEBUG nova.compute.manager [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Refreshing instance network info cache due to event network-changed-0bb26125-f048-4751-9c50-c8b2ff511438. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.217 186792 DEBUG oslo_concurrency.lockutils [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eabc0ddf-fdc1-473a-b224-0dc0954d754c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.217 186792 DEBUG oslo_concurrency.lockutils [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eabc0ddf-fdc1-473a-b224-0dc0954d754c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.217 186792 DEBUG nova.network.neutron [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Refreshing network info cache for port 0bb26125-f048-4751-9c50-c8b2ff511438 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.226 186792 DEBUG nova.virt.libvirt.host [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.227 186792 DEBUG nova.virt.libvirt.host [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.230 186792 DEBUG nova.virt.libvirt.host [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.231 186792 DEBUG nova.virt.libvirt.host [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.232 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.233 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.233 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.233 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.234 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.234 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.234 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.234 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.234 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.235 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.235 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.235 186792 DEBUG nova.virt.hardware [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.239 186792 DEBUG nova.virt.libvirt.vif [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-2063451211',display_name='tempest-ImagesOneServerNegativeTestJSON-server-2063451211',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-2063451211',id=30,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-8u6fkyns',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner
_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:27Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=eabc0ddf-fdc1-473a-b224-0dc0954d754c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.239 186792 DEBUG nova.network.os_vif_util [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.240 186792 DEBUG nova.network.os_vif_util [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.241 186792 DEBUG nova.objects.instance [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'pci_devices' on Instance uuid eabc0ddf-fdc1-473a-b224-0dc0954d754c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.262 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <uuid>eabc0ddf-fdc1-473a-b224-0dc0954d754c</uuid>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <name>instance-0000001e</name>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-2063451211</nova:name>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:46:34</nova:creationTime>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:user uuid="b47fa480dd1c4c9f81da16b464195f2b">tempest-ImagesOneServerNegativeTestJSON-328128522-project-member</nova:user>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:project uuid="b94109a356454dbda245fe5e57d0cd82">tempest-ImagesOneServerNegativeTestJSON-328128522</nova:project>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        <nova:port uuid="0bb26125-f048-4751-9c50-c8b2ff511438">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <entry name="serial">eabc0ddf-fdc1-473a-b224-0dc0954d754c</entry>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <entry name="uuid">eabc0ddf-fdc1-473a-b224-0dc0954d754c</entry>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.config"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:de:f8:7c"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <target dev="tap0bb26125-f0"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/console.log" append="off"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:46:34 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:46:34 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:46:34 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:46:34 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.264 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Preparing to wait for external event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.265 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.265 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.266 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.267 186792 DEBUG nova.virt.libvirt.vif [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-2063451211',display_name='tempest-ImagesOneServerNegativeTestJSON-server-2063451211',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-2063451211',id=30,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-8u6fkyns',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128
522',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:27Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=eabc0ddf-fdc1-473a-b224-0dc0954d754c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.267 186792 DEBUG nova.network.os_vif_util [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.268 186792 DEBUG nova.network.os_vif_util [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.268 186792 DEBUG os_vif [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.269 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.269 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.270 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.273 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.273 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0bb26125-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.274 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0bb26125-f0, col_values=(('external_ids', {'iface-id': '0bb26125-f048-4751-9c50-c8b2ff511438', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:f8:7c', 'vm-uuid': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.275 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:34 np0005531888 NetworkManager[55166]: <info>  [1763797594.2767] manager: (tap0bb26125-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.279 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.285 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.287 186792 INFO os_vif [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0')#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.345 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.345 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.346 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No VIF found with MAC fa:16:3e:de:f8:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.346 186792 INFO nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Using config drive#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.663 186792 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.669 186792 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:34 np0005531888 nova_compute[186788]: 2025-11-22 07:46:34.687 186792 DEBUG nova.objects.instance [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.304 186792 INFO nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Creating config drive at /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.config#033[00m
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.309 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpddlwicxb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.442 186792 DEBUG oslo_concurrency.processutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpddlwicxb" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:35 np0005531888 kernel: tap0bb26125-f0: entered promiscuous mode
Nov 22 02:46:35 np0005531888 NetworkManager[55166]: <info>  [1763797595.5133] manager: (tap0bb26125-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.515 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:35Z|00100|binding|INFO|Claiming lport 0bb26125-f048-4751-9c50-c8b2ff511438 for this chassis.
Nov 22 02:46:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:35Z|00101|binding|INFO|0bb26125-f048-4751-9c50-c8b2ff511438: Claiming fa:16:3e:de:f8:7c 10.100.0.12
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.542 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:f8:7c 10.100.0.12'], port_security=['fa:16:3e:de:f8:7c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=0bb26125-f048-4751-9c50-c8b2ff511438) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.543 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 0bb26125-f048-4751-9c50-c8b2ff511438 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 bound to our chassis#033[00m
Nov 22 02:46:35 np0005531888 systemd-udevd[217764]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.545 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4aa99606-7691-4fcb-846d-56459aaaa088#033[00m
Nov 22 02:46:35 np0005531888 NetworkManager[55166]: <info>  [1763797595.5614] device (tap0bb26125-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.561 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2b06ac0d-0581-459e-99fc-db862dc3d09a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.562 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4aa99606-71 in ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:46:35 np0005531888 NetworkManager[55166]: <info>  [1763797595.5625] device (tap0bb26125-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.565 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4aa99606-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.565 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3a903f-c896-466e-ab62-9f1220e66872]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.566 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1627bf67-a6b8-4acf-93cf-de4eb38752d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 systemd-machined[153106]: New machine qemu-15-instance-0000001e.
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.582 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.584 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[999ef9e9-07a1-45f7-9584-96261fd3fc9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:35Z|00102|binding|INFO|Setting lport 0bb26125-f048-4751-9c50-c8b2ff511438 ovn-installed in OVS
Nov 22 02:46:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:35Z|00103|binding|INFO|Setting lport 0bb26125-f048-4751-9c50-c8b2ff511438 up in Southbound
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.588 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:35 np0005531888 systemd[1]: Started Virtual Machine qemu-15-instance-0000001e.
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.599 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e69efc-4cf0-46ed-ad23-8d32e4d4b0e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.631 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4083be-f7cf-489c-a826-ce6c556b550f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.638 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba4ac9c-37e2-4f43-b751-9214de2b2250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 NetworkManager[55166]: <info>  [1763797595.6398] manager: (tap4aa99606-70): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.680 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2ec1c8-2653-4559-8f1f-7776a805ce4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.683 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9d73b807-0c14-4efb-944e-c9c4f9497fd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 NetworkManager[55166]: <info>  [1763797595.7117] device (tap4aa99606-70): carrier: link connected
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.720 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6490956e-ae7f-4073-b54c-61e3e2f45714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.738 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cbcf7ae0-ab82-4765-99e9-f4c847edef75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4aa99606-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:b3:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433635, 'reachable_time': 38484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217800, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.756 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[35cf78f6-9a18-4ff7-87b5-750b3519307e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe85:b3a9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433635, 'tstamp': 433635}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217801, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.774 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[81a5acac-7627-4679-8536-ff46344c5ca2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4aa99606-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:b3:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433635, 'reachable_time': 38484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217802, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.814 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c99ee0d7-320b-4412-8a5a-4c96aa758e91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.878 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[89b0c19a-17d0-4cc0-8d9e-89fe83a3fcdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.880 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa99606-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.880 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.880 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4aa99606-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.882 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:35 np0005531888 kernel: tap4aa99606-70: entered promiscuous mode
Nov 22 02:46:35 np0005531888 NetworkManager[55166]: <info>  [1763797595.8841] manager: (tap4aa99606-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.887 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4aa99606-70, col_values=(('external_ids', {'iface-id': 'ef41332e-7ec0-4d28-b824-d5b12ab6995f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.888 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:35Z|00104|binding|INFO|Releasing lport ef41332e-7ec0-4d28-b824-d5b12ab6995f from this chassis (sb_readonly=0)
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.891 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.891 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c69b3ad9-774b-4337-88c8-ddd7ba54fe4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.892 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:46:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:35.893 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'env', 'PROCESS_TAG=haproxy-4aa99606-7691-4fcb-846d-56459aaaa088', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4aa99606-7691-4fcb-846d-56459aaaa088.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:46:35 np0005531888 nova_compute[186788]: 2025-11-22 07:46:35.901 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:36 np0005531888 podman[217834]: 2025-11-22 07:46:36.261076026 +0000 UTC m=+0.051168939 container create b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:46:36 np0005531888 systemd[1]: Started libpod-conmon-b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631.scope.
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:36 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:46:36 np0005531888 podman[217834]: 2025-11-22 07:46:36.232650025 +0000 UTC m=+0.022742958 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:46:36 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aedf59e408f4cc2856d6b8bd51c9a8e09a8ae81e057fc062ffde8cf83a60afe9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:46:36 np0005531888 podman[217834]: 2025-11-22 07:46:36.342656645 +0000 UTC m=+0.132749588 container init b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:46:36 np0005531888 podman[217834]: 2025-11-22 07:46:36.349380846 +0000 UTC m=+0.139473759 container start b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 02:46:36 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [NOTICE]   (217859) : New worker (217862) forked
Nov 22 02:46:36 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [NOTICE]   (217859) : Loading success.
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.398 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797596.3974745, eabc0ddf-fdc1-473a-b224-0dc0954d754c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.398 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] VM Started (Lifecycle Event)#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.432 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.436 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797596.398667, eabc0ddf-fdc1-473a-b224-0dc0954d754c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.437 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.476 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.480 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.502 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:46:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:36.798 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:36.799 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:36.800 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.838 186792 DEBUG nova.compute.manager [req-a2a7c844-d24d-45a2-bc0d-f6d9b8abaf68 req-6612991e-5e8b-4e69-9d27-6a4743bf1878 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.839 186792 DEBUG oslo_concurrency.lockutils [req-a2a7c844-d24d-45a2-bc0d-f6d9b8abaf68 req-6612991e-5e8b-4e69-9d27-6a4743bf1878 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.839 186792 DEBUG oslo_concurrency.lockutils [req-a2a7c844-d24d-45a2-bc0d-f6d9b8abaf68 req-6612991e-5e8b-4e69-9d27-6a4743bf1878 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.839 186792 DEBUG oslo_concurrency.lockutils [req-a2a7c844-d24d-45a2-bc0d-f6d9b8abaf68 req-6612991e-5e8b-4e69-9d27-6a4743bf1878 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.839 186792 DEBUG nova.compute.manager [req-a2a7c844-d24d-45a2-bc0d-f6d9b8abaf68 req-6612991e-5e8b-4e69-9d27-6a4743bf1878 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Processing event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.840 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.839 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'name': 'tempest-LiveMigrationTest-server-810629940', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd48bda61691e4f778b6d30c0dc773a30', 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'hostId': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.844 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797596.843841, eabc0ddf-fdc1-473a-b224-0dc0954d754c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.844 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.846 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001e', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b94109a356454dbda245fe5e57d0cd82', 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'hostId': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.846 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.848 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.850 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 / tap973cdfc2-4a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.850 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.852 186792 INFO nova.virt.libvirt.driver [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Instance spawned successfully.#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.853 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.853 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eabc0ddf-fdc1-473a-b224-0dc0954d754c / tap0bb26125-f0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.853 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5aac90f6-bb70-45d6-b133-0af0665b9072', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:36.846856', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5faa3940-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': 'd99549c961567e2f7038bfd04efebdb6b42a6c549343552ef2e6634b0dac89f0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:36.846856', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5faaa1f0-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '30e16ebb7cb9199c18e2e4c78fec44550e88ade86b4de873299bade56897d4e8'}]}, 'timestamp': '2025-11-22 07:46:36.854101', '_unique_id': 'decb36888e6842828a99c1166c76f0d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.855 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.857 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.857 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9653cb9e-1c31-423d-abfd-0cc1b15681ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:36.857006', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fab2224-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': 'a714bd8edc2495d9ff172daf513b44a253f7c552c32f6c04e04ac01075f6d056'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:36.857006', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fab2e4a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': 'c5e0d6e926f87894461d7e3e8e5b2d67640b20dd67563ab3d0200a1bfb2e4935'}]}, 'timestamp': '2025-11-22 07:46:36.857666', '_unique_id': '71fb52b1857d435cbb595777745a5656'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.858 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.859 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.870 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.870 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.882 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.882 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.882 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5f9c2f8-cd0a-4996-981c-9bdf4fbbc6eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:36.859111', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fad3514-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.558624209, 'message_signature': 'c342a40dcffbe232e0bebdc9c44759d243bcb167ae52295de9e996ee04180e88'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 
'09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:36.859111', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fad3e88-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.558624209, 'message_signature': '9d8e3fe9ee036d0017e8ddee7cd4f3ffec524a02d89978bc127f45e7cca0c5a8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:36.859111', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5faf08ee-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.570596233, 'message_signature': '90b71b77a906d988fe6bb7991308a79c4dce46bf35690bdc44cd4e0df31d341b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:36.859111', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5faf133e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.570596233, 'message_signature': '57af08ab210a01333da06cff1ef73d4d12ee41b306370875060c1c178176a2b7'}]}, 'timestamp': '2025-11-22 07:46:36.883186', '_unique_id': '1b43cba853ef42b1bd7b611a31701c79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.885 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.885 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.885 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.885 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7df930f2-5468-43e4-94fa-b13a6c908eec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:36.885131', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5faf6c44-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.558624209, 'message_signature': '1e5a165b011385c1d8b112e027a96c385c6cf4a96192f1e32ddb0d79e7ffed95'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 
'09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:36.885131', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5faf74fa-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.558624209, 'message_signature': '723cc97177ab1a63661ccdf97ef3541ae24e592902fd19f39ca4b0a03e9b066c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:36.885131', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5faf7d7e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.570596233, 'message_signature': '38cf299bfde75da7aec4642370032e908cf2cdcc7aa1dec1aae1bf57ef315870'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:36.885131', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5faf84cc-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.570596233, 'message_signature': '833bd78ac22a50c1f9e60b22a558c346ab7da2d40b18d46f3ae91f04a123c53d'}]}, 'timestamp': '2025-11-22 07:46:36.886060', '_unique_id': '6b02bd91f6574553be2dc7eae2be7a90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.887 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.887 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.887 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b9a8040-1e3b-49f5-99dc-0b6d0d33da6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:36.887265', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fafc05e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': '92df9af98b788aebe25d097576205964ae91840985de6dd6b969a0f1de96292e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:36.887265', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fafc98c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '0c1b5a92a229871c9b51b5957ca803b03af77c896be356cc580bfd400320f021'}]}, 'timestamp': '2025-11-22 07:46:36.887795', '_unique_id': 'af93f5a5c1e443e286def65c787f6bb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.888 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.889 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>]
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.897 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.897 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.898 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.898 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.899 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.899 186792 DEBUG nova.virt.libvirt.driver [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.909 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.918 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.919 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.947 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.948 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '959bf78f-5e2b-4850-a0ba-8987cd1c9c7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:36.889280', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fb48a44-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '3eb0b8016f823c2e82d458ac3e4d288aa6f2b7318376bbda4ed74a54251eff0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 
'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:36.889280', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fb4967e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '88cc36ff11e45d36d2aa9194b78a99d25217041b66ca25bef3ff1d86bc76af81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:36.889280', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fb90d80-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '5d7a6a201ca5aed4a7f07b3b064574edf78f9398985135c3a102af1e9c7d33fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:36.889280', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fb91d70-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': 'c919ea715f163fcd09f7de061d32974c864259948d5499d6c4816f4d1c3e16e9'}]}, 'timestamp': '2025-11-22 07:46:36.948949', '_unique_id': 'f5eac0eb2bb44041b4bde590f5aaeff4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.951 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.951 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.outgoing.bytes volume: 5616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.951 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '092ec54d-f155-48d2-aa0e-f3220aa146bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5616, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:36.951260', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fb982d8-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': 'dde9ccd2148ee61ac1d3baebf42761db61cc60df2013686a13acfea196b12599'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:36.951260', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fb98e7c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '2ce1639e8e4f4493f7b8a0704303ef42f82e53b9e14b71e1582aeade4932d6f3'}]}, 'timestamp': '2025-11-22 07:46:36.951860', '_unique_id': 'df4790d277bb4c16bd69d520cd053195'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.953 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.953 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.953 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.954 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.954 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12404599-5d7c-4759-b093-0d9c5cc06841', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:36.953420', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fb9d80a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '92e2ee74723a81262969361fe82109ed20e83c102bd55089405dcf277f1427ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 
'09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:36.953420', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fb9e2be-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '9c12bedc8abac448a677bbfe8acd94d03569b8990c080231dc363ca73cd09633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:36.953420', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fb9ee6c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '17cc34efedab273526ef6f91fcce59e357faccaf1c7f37eb80b53cbcbb2f40e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:36.953420', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fb9f90c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '1eec676ca6558fdca6ab5f35edb44422688b9b4b5e441f4617d35850fb4a864a'}]}, 'timestamp': '2025-11-22 07:46:36.954608', '_unique_id': '80e6d392bada405a94add4290cdeb32c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.956 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.956 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.957 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.957 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffda0919-2b97-46ff-94b7-ae4315be45fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:36.956496', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fba50c8-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '33e4ceb596dd9dcb811ac5c6d3fe4cbf73cd30ab67264046fc3a67d0b17491bc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 
'09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:36.956496', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fba5c3a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': 'cadc60a5b389c8ad817967257d7d9f8da4909ceea798e42bdf10a2589779434d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:36.956496', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fba670c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': 'ddc18dbb0e4db475513bbe5bcc5c0b67ab44630309d20e69de3620e0c38c2c4e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:36.956496', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fba7116-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '7ef1d55ca3c34ed1a701e929489ae204f47ca15b0ed9099d6dc29644221dc77c'}]}, 'timestamp': '2025-11-22 07:46:36.957667', '_unique_id': '415ded7940414d6d8d2497b16962ea5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.959 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.959 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>]
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.959 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.959 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.959 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>]
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.960 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.960 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3d7868d-8af4-4ad1-b783-f0b8b2facfd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:36.960248', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fbae29a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': 'f335ceff4b7c23d7a6160605db44762834cf2947bda0267511de6dd678e504a5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:36.960248', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fbaeff6-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '7edef0425ab2936d222b4103e090e8ce9919641f6ef263b6a8f51a9aff3b0959'}]}, 'timestamp': '2025-11-22 07:46:36.960912', '_unique_id': 'b1927b1974eb4c70abb9d74bf639c268'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.962 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.979 186792 INFO nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Took 9.62 seconds to spawn the instance on the hypervisor.
Nov 22 02:46:36 np0005531888 nova_compute[186788]: 2025-11-22 07:46:36.980 186792 DEBUG nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.981 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.999 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:36.999 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance eabc0ddf-fdc1-473a-b224-0dc0954d754c: ceilometer.compute.pollsters.NoVolumeException
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f442eae-e2b0-43b0-8d1b-8c9095b39ac4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'timestamp': '2025-11-22T07:46:36.962615', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5fbe2a0e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.680808578, 'message_signature': '7184b30199bc0b17e755d1aca29d37d5da18b809d9643a2922910ad321434fef'}]}, 'timestamp': '2025-11-22 07:46:36.999615', '_unique_id': 'f2aeb18795e94be5b1fd07b23a2a49ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.000 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.001 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.001 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74400fd6-31c6-4c9b-8c92-5e2f91df39d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:37.001618', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fc1326c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': '458bf5eba760dfb328f5bf7fb4d61984e2801487ef2b72d66c6d03fb58c2eed2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:37.001618', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fc13f1e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': 'bb666d148d87b8f4cd842078ec0e865fa05b3fbeb4952e3bc48eedf575765d2c'}]}, 'timestamp': '2025-11-22 07:46:37.002243', '_unique_id': 'd07ca27e621444569e6835dc9065e273'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.002 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.003 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.003 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56b325ea-a87c-4200-b9fb-81a9b9650a9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:37.003630', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fc17fce-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': 'fd89c3f8172c65fd6410c960cf83400b9162c6878450e9efd34e874b87a1de5d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:37.003630', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fc1887a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '27721578985eaca7d2d22f2926799de45961e38a27bc5c71aafa91343574953a'}]}, 'timestamp': '2025-11-22 07:46:37.004096', '_unique_id': 'e74d40e55a1b435d9480a2e5077a7c46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.004 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.005 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.005 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/cpu volume: 80000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.005 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/cpu volume: 120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '763bb9fa-0a88-4f76-92e8-d1bd3dcb2f70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80000000, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'timestamp': '2025-11-22T07:46:37.005301', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5fc1c092-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.680808578, 'message_signature': '80b3786b712c0fdbe1006bfca32f79c85fd11c435597c4e1e854d4243bd47c4c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 120000000, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 
'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'timestamp': '2025-11-22T07:46:37.005301', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5fc1c984-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.698422415, 'message_signature': '8cfb52b13582207787819b07a6b5161496ca3960bb5f66830282a2ab9585fd09'}]}, 'timestamp': '2025-11-22 07:46:37.005757', '_unique_id': '98ae7f0464994e739afdaef076a6b218'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.007 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.007 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-810629940>, <NovaLikeServer: tempest-ImagesOneServerNegativeTestJSON-server-2063451211>]
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.007 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.007 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8be87db6-57b8-4266-affa-039e2edda0b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:37.007286', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fc20e08-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': '996eb123951fa0641c4583a6c1000547b05ee8e19aac88a001e0103e4fcfe3ee'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:37.007286', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fc21dc6-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '52429c04866678a0fce5eaf2240848755501605f5812da58caa34e2dc97bbebd'}]}, 'timestamp': '2025-11-22 07:46:37.007928', '_unique_id': '9fc33cc89c91446083964ab00f0e78b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.009 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.write.requests volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.009 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.009 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.009 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c19da036-adae-440e-ae9a-57fa0f0c50d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 8, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:37.009164', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc25764-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '5735bd717e3b97c31b5b47709c90cb007df3e5ed5019073a0b28116e3d19186d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 
'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:37.009164', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc25fb6-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '2e779780070e030ffef0f094730a2cebfa08030d43091edffd6320e2f40886cf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:37.009164', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc26998-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': 'e2641ec033614dcf2f57887e01cec53873404e09217a401ace515e57c0c62d1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:37.009164', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc2721c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '07db4fa2c460a8b021ee86302c022cba5ea88a8826baa048e3dafbddafd6140d'}]}, 'timestamp': '2025-11-22 07:46:37.010082', '_unique_id': 'a3fed6fc889141c3bfc39ca38f706ce2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.010 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.011 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.011 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53e0cfb8-d37a-4b2e-9ee8-1017a8088800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:37.011684', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fc2b984-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': 'c12f400e6317d051e70a80e085601ee5133e083084f24ba297df473f5ac9bbe2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:37.011684', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fc2c172-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': '8d01e73cc8ae5dd6d05c74cd36d9eaae8f7ddbfb8a78f2f793fd50b0748cfd69'}]}, 'timestamp': '2025-11-22 07:46:37.012103', '_unique_id': 'd40d919671ef4642ac6992a0cc35eca2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.012 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.013 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.013 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a755225-2410-4d5c-b7f7-150129473666', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 'instance-0000001c-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-tap973cdfc2-4a', 'timestamp': '2025-11-22T07:46:37.013259', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'tap973cdfc2-4a', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:02:9b:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap973cdfc2-4a'}, 'message_id': '5fc2f73c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.546353357, 'message_signature': '681eafc9d983fd5d2130eb14f6fd06388ad2d829fb64c2407cf9fd339b0cd5bf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'instance-0000001e-eabc0ddf-fdc1-473a-b224-0dc0954d754c-tap0bb26125-f0', 'timestamp': '2025-11-22T07:46:37.013259', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'tap0bb26125-f0', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:f8:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb26125-f0'}, 'message_id': '5fc300c4-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.550887392, 'message_signature': 'c9634ace7cbb4eeb53775ac1c5eee5804f9b71e18702cf8778251db05b14a3bf'}]}, 'timestamp': '2025-11-22 07:46:37.013727', '_unique_id': '4c6c2a8955794d44be7ce2c9da219699'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.014 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.015 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.015 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.015 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93416792-fecb-4b32-9430-979c49d94d39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:37.014819', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc333e6-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.558624209, 'message_signature': '48868fd7fa60783c90fdb3b87bd118820b0a853d471edf4f51a66924a4eec9eb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': 
'09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:37.014819', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc33b8e-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.558624209, 'message_signature': 'ffae29a625eed1a0b65ad6607ff50cdcea5c999af20182286a3eb1481f7cd20c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:37.014819', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc342f0-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.570596233, 'message_signature': '3828f884d3598bc88f105d4c35e59392f5d276ba16b41280db4c4004b03b5f5e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:37.014819', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc34a0c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.570596233, 'message_signature': 'e3a13cc19fc149cf824e5f1b641613ded0e33881d8f0b9e185fab29e8cbd1399'}]}, 'timestamp': '2025-11-22 07:46:37.015614', '_unique_id': '21b451f5408045be86dae8eb8cf28420'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.016 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.write.bytes volume: 167936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.017 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.017 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.017 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97b38bee-4b35-494d-8779-5b3da1c22ba4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 167936, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:37.016848', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc38346-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '457694aea8be204821c8ef0f81809dec4a1acf73e3ef2303b8a0a13a27264e15'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 
'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:37.016848', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc38ae4-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': '4e77ca7b9c87567182e59ede058793287bca7b64b7d4585260eca6edbc2e3e3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:37.016848', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc39264-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '961f03017bf3a03d9db1cbb1a586fafb0288fed2d73ce5b1f259f073ab880c95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:37.016848', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc39976-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': 'fec73e3b1c0b97e20fab4ee682aba11cfbdc0259e3e83b55b98e7781788d0f25'}]}, 'timestamp': '2025-11-22 07:46:37.017653', '_unique_id': 'ed49518350484aa08fc7727dbec0fd32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.018 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.write.latency volume: 6756862 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.019 12 DEBUG ceilometer.compute.pollsters [-] 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.019 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.019 12 DEBUG ceilometer.compute.pollsters [-] eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bab2e50b-32e2-4647-ad9e-038de72614ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6756862, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-vda', 'timestamp': '2025-11-22T07:46:37.018811', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc3cfc2-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': 'bb6b2d75fd29e7199c16a2cfd11e942f678fc8ab575d6a0bdcb2005227d6f490'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_name': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_name': None, 
'resource_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-sda', 'timestamp': '2025-11-22T07:46:37.018811', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-810629940', 'name': 'instance-0000001c', 'instance_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'instance_type': 'm1.nano', 'host': 'b4f37627cdba1f75593baefe7a9b42deb657fd645515ebc8e632ddc3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc3d724-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.588881977, 'message_signature': 'f2afdfa43afa3b0002358ec5d878281c38cf14a7e85c4c25cf74e80a17ba48b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-vda', 'timestamp': '2025-11-22T07:46:37.018811', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5fc3de9a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': 'b61312ea4ca81b296b8e66b8fd1230524f7973b10976eb3de20c80372e854c07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_name': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_name': None, 'resource_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c-sda', 'timestamp': '2025-11-22T07:46:37.018811', 'resource_metadata': {'display_name': 'tempest-ImagesOneServerNegativeTestJSON-server-2063451211', 'name': 'instance-0000001e', 'instance_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'instance_type': 'm1.nano', 'host': '1efd018f9a992b48379f643181b900f3f31e0d065130eb87d480d96b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5fc3e5d4-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4337.618768585, 'message_signature': '88ae328b337c515e82ab285fa092deec52ef758bbb903fccd44f4c6c3f734691'}]}, 'timestamp': '2025-11-22 07:46:37.019599', '_unique_id': 'a4c11d57b75943739297204d8cd53642'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:46:37.020 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:37 np0005531888 nova_compute[186788]: 2025-11-22 07:46:37.084 186792 INFO nova.compute.manager [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Took 11.17 seconds to build instance.#033[00m
Nov 22 02:46:37 np0005531888 nova_compute[186788]: 2025-11-22 07:46:37.107 186792 DEBUG oslo_concurrency.lockutils [None req-f58bb793-89bc-43d1-9107-0513d57fd6b2 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:37 np0005531888 nova_compute[186788]: 2025-11-22 07:46:37.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:37 np0005531888 nova_compute[186788]: 2025-11-22 07:46:37.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:38 np0005531888 nova_compute[186788]: 2025-11-22 07:46:38.504 186792 DEBUG nova.network.neutron [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Updated VIF entry in instance network info cache for port 0bb26125-f048-4751-9c50-c8b2ff511438. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:38 np0005531888 nova_compute[186788]: 2025-11-22 07:46:38.504 186792 DEBUG nova.network.neutron [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Updating instance_info_cache with network_info: [{"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:38 np0005531888 nova_compute[186788]: 2025-11-22 07:46:38.530 186792 DEBUG oslo_concurrency.lockutils [req-9b353d96-4215-4363-98e7-bf4fb1f285ec req-e06837de-062c-4a33-90e5-580fa4d35dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eabc0ddf-fdc1-473a-b224-0dc0954d754c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:38 np0005531888 podman[217871]: 2025-11-22 07:46:38.705804669 +0000 UTC m=+0.066657122 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:46:38 np0005531888 nova_compute[186788]: 2025-11-22 07:46:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.104 186792 DEBUG nova.compute.manager [req-d2e293b2-677c-4c20-98f4-8b73956e5467 req-739b5803-3c93-4fa9-afc3-590635172eed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.105 186792 DEBUG oslo_concurrency.lockutils [req-d2e293b2-677c-4c20-98f4-8b73956e5467 req-739b5803-3c93-4fa9-afc3-590635172eed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.105 186792 DEBUG oslo_concurrency.lockutils [req-d2e293b2-677c-4c20-98f4-8b73956e5467 req-739b5803-3c93-4fa9-afc3-590635172eed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.105 186792 DEBUG oslo_concurrency.lockutils [req-d2e293b2-677c-4c20-98f4-8b73956e5467 req-739b5803-3c93-4fa9-afc3-590635172eed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.105 186792 DEBUG nova.compute.manager [req-d2e293b2-677c-4c20-98f4-8b73956e5467 req-739b5803-3c93-4fa9-afc3-590635172eed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] No waiting events found dispatching network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.106 186792 WARNING nova.compute.manager [req-d2e293b2-677c-4c20-98f4-8b73956e5467 req-739b5803-3c93-4fa9-afc3-590635172eed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received unexpected event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:39 np0005531888 nova_compute[186788]: 2025-11-22 07:46:39.994 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Check if temp file /var/lib/nova/instances/tmppob76voc exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 22 02:46:40 np0005531888 nova_compute[186788]: 2025-11-22 07:46:40.001 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:40 np0005531888 nova_compute[186788]: 2025-11-22 07:46:40.065 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:40 np0005531888 nova_compute[186788]: 2025-11-22 07:46:40.067 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:40 np0005531888 nova_compute[186788]: 2025-11-22 07:46:40.150 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:40 np0005531888 nova_compute[186788]: 2025-11-22 07:46:40.152 186792 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.287 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.320 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.343 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.344 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.403 186792 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:41 np0005531888 podman[217903]: 2025-11-22 07:46:41.701373835 +0000 UTC m=+0.057874949 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:41 np0005531888 nova_compute[186788]: 2025-11-22 07:46:41.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:46:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:42.916 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:42 np0005531888 nova_compute[186788]: 2025-11-22 07:46:42.917 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:42.918 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:46:42 np0005531888 nova_compute[186788]: 2025-11-22 07:46:42.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:42 np0005531888 nova_compute[186788]: 2025-11-22 07:46:42.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:42 np0005531888 nova_compute[186788]: 2025-11-22 07:46:42.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:42 np0005531888 nova_compute[186788]: 2025-11-22 07:46:42.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:42 np0005531888 nova_compute[186788]: 2025-11-22 07:46:42.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.076 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.138 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.139 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.201 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.209 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.277 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.278 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.340 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.543 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.546 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5477MB free_disk=73.42719268798828GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.546 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.546 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.612 186792 INFO nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating resource usage from migration 31f45e55-ca5e-47f9-8b1e-254643907ee9#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.640 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eabc0ddf-fdc1-473a-b224-0dc0954d754c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.641 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration 31f45e55-ca5e-47f9-8b1e-254643907ee9 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.642 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.642 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.709 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.726 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.746 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:46:43 np0005531888 nova_compute[186788]: 2025-11-22 07:46:43.747 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:43 np0005531888 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:46:43 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:46:43 np0005531888 systemd-logind[825]: New session 41 of user nova.
Nov 22 02:46:43 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:46:44 np0005531888 podman[217942]: 2025-11-22 07:46:44.004355983 +0000 UTC m=+0.084036882 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 22 02:46:44 np0005531888 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:46:44 np0005531888 systemd[217963]: Queued start job for default target Main User Target.
Nov 22 02:46:44 np0005531888 systemd[217963]: Created slice User Application Slice.
Nov 22 02:46:44 np0005531888 systemd[217963]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:46:44 np0005531888 systemd[217963]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:46:44 np0005531888 systemd[217963]: Reached target Paths.
Nov 22 02:46:44 np0005531888 systemd[217963]: Reached target Timers.
Nov 22 02:46:44 np0005531888 systemd[217963]: Starting D-Bus User Message Bus Socket...
Nov 22 02:46:44 np0005531888 systemd[217963]: Starting Create User's Volatile Files and Directories...
Nov 22 02:46:44 np0005531888 systemd[217963]: Finished Create User's Volatile Files and Directories.
Nov 22 02:46:44 np0005531888 systemd[217963]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:46:44 np0005531888 systemd[217963]: Reached target Sockets.
Nov 22 02:46:44 np0005531888 systemd[217963]: Reached target Basic System.
Nov 22 02:46:44 np0005531888 systemd[217963]: Reached target Main User Target.
Nov 22 02:46:44 np0005531888 systemd[217963]: Startup finished in 150ms.
Nov 22 02:46:44 np0005531888 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:46:44 np0005531888 systemd[1]: Started Session 41 of User nova.
Nov 22 02:46:44 np0005531888 nova_compute[186788]: 2025-11-22 07:46:44.281 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:44 np0005531888 systemd[1]: session-41.scope: Deactivated successfully.
Nov 22 02:46:44 np0005531888 systemd-logind[825]: Session 41 logged out. Waiting for processes to exit.
Nov 22 02:46:44 np0005531888 systemd-logind[825]: Removed session 41.
Nov 22 02:46:44 np0005531888 nova_compute[186788]: 2025-11-22 07:46:44.635 186792 DEBUG nova.compute.manager [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:44 np0005531888 nova_compute[186788]: 2025-11-22 07:46:44.706 186792 INFO nova.compute.manager [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] instance snapshotting#033[00m
Nov 22 02:46:44 np0005531888 nova_compute[186788]: 2025-11-22 07:46:44.746 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.011 186792 INFO nova.virt.libvirt.driver [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Beginning live snapshot process#033[00m
Nov 22 02:46:45 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.218 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.284 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.286 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.340 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.353 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.408 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.410 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpraq_3vnh/f5749243db6942f8a8dd0c4aa428ed62.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.451 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpraq_3vnh/f5749243db6942f8a8dd0c4aa428ed62.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.453 186792 INFO nova.virt.libvirt.driver [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.482 186792 DEBUG nova.compute.manager [req-bbd7480d-9ced-4519-9729-6a48bdc527b9 req-376b73b5-3750-4f67-a671-ec69edbc9f78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.483 186792 DEBUG oslo_concurrency.lockutils [req-bbd7480d-9ced-4519-9729-6a48bdc527b9 req-376b73b5-3750-4f67-a671-ec69edbc9f78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.484 186792 DEBUG oslo_concurrency.lockutils [req-bbd7480d-9ced-4519-9729-6a48bdc527b9 req-376b73b5-3750-4f67-a671-ec69edbc9f78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.484 186792 DEBUG oslo_concurrency.lockutils [req-bbd7480d-9ced-4519-9729-6a48bdc527b9 req-376b73b5-3750-4f67-a671-ec69edbc9f78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.484 186792 DEBUG nova.compute.manager [req-bbd7480d-9ced-4519-9729-6a48bdc527b9 req-376b73b5-3750-4f67-a671-ec69edbc9f78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.484 186792 DEBUG nova.compute.manager [req-bbd7480d-9ced-4519-9729-6a48bdc527b9 req-376b73b5-3750-4f67-a671-ec69edbc9f78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.515 186792 DEBUG nova.virt.libvirt.guest [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.519 186792 INFO nova.virt.libvirt.driver [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.553 186792 DEBUG nova.privsep.utils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.554 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpraq_3vnh/f5749243db6942f8a8dd0c4aa428ed62.delta /var/lib/nova/instances/snapshots/tmpraq_3vnh/f5749243db6942f8a8dd0c4aa428ed62 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.824 186792 DEBUG oslo_concurrency.processutils [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpraq_3vnh/f5749243db6942f8a8dd0c4aa428ed62.delta /var/lib/nova/instances/snapshots/tmpraq_3vnh/f5749243db6942f8a8dd0c4aa428ed62" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:45 np0005531888 nova_compute[186788]: 2025-11-22 07:46:45.826 186792 INFO nova.virt.libvirt.driver [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.120 186792 INFO nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Took 4.71 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.122 186792 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.149 186792 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(31f45e55-ca5e-47f9-8b1e-254643907ee9),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.174 186792 DEBUG nova.objects.instance [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'migration_context' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.175 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.176 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.177 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.190 186792 DEBUG nova.virt.libvirt.vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:34Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.191 186792 DEBUG nova.network.os_vif_util [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.191 186792 DEBUG nova.network.os_vif_util [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.192 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 02:46:46 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:02:9b:98"/>
Nov 22 02:46:46 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 02:46:46 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:46:46 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 02:46:46 np0005531888 nova_compute[186788]:  <target dev="tap973cdfc2-4a"/>
Nov 22 02:46:46 np0005531888 nova_compute[186788]: </interface>
Nov 22 02:46:46 np0005531888 nova_compute[186788]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.192 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.322 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.325 186792 WARNING nova.compute.manager [None req-83245a81-bc4e-4486-9321-a75524a8e970 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Image not found during snapshot: nova.exception.ImageNotFound: Image 50fff91a-5b8c-429f-bffd-ce62a92439dd could not be found.#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.681 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.681 186792 INFO nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 22 02:46:46 np0005531888 nova_compute[186788]: 2025-11-22 07:46:46.749 186792 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 22 02:46:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:46.920 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.251 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.252 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.757 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.757 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.888 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.889 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.889 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.889 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.889 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.898 186792 INFO nova.compute.manager [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Terminating instance#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.906 186792 DEBUG nova.compute.manager [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:46:47 np0005531888 kernel: tap0bb26125-f0 (unregistering): left promiscuous mode
Nov 22 02:46:47 np0005531888 NetworkManager[55166]: <info>  [1763797607.9371] device (tap0bb26125-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.952 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:47Z|00105|binding|INFO|Releasing lport 0bb26125-f048-4751-9c50-c8b2ff511438 from this chassis (sb_readonly=0)
Nov 22 02:46:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:47Z|00106|binding|INFO|Setting lport 0bb26125-f048-4751-9c50-c8b2ff511438 down in Southbound
Nov 22 02:46:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:47Z|00107|binding|INFO|Removing iface tap0bb26125-f0 ovn-installed in OVS
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:47.964 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:f8:7c 10.100.0.12'], port_security=['fa:16:3e:de:f8:7c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'eabc0ddf-fdc1-473a-b224-0dc0954d754c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=0bb26125-f048-4751-9c50-c8b2ff511438) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:47.965 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 0bb26125-f048-4751-9c50-c8b2ff511438 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 unbound from our chassis#033[00m
Nov 22 02:46:47 np0005531888 nova_compute[186788]: 2025-11-22 07:46:47.966 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:47.968 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4aa99606-7691-4fcb-846d-56459aaaa088, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:46:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:47.970 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b76b76ee-9aa2-4b86-a974-186c039895df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:47.971 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 namespace which is not needed anymore#033[00m
Nov 22 02:46:47 np0005531888 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Nov 22 02:46:47 np0005531888 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001e.scope: Consumed 12.050s CPU time.
Nov 22 02:46:48 np0005531888 systemd-machined[153106]: Machine qemu-15-instance-0000001e terminated.
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [NOTICE]   (217859) : haproxy version is 2.8.14-c23fe91
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [NOTICE]   (217859) : path to executable is /usr/sbin/haproxy
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [WARNING]  (217859) : Exiting Master process...
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [WARNING]  (217859) : Exiting Master process...
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [ALERT]    (217859) : Current worker (217862) exited with code 143 (Terminated)
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[217853]: [WARNING]  (217859) : All workers exited. Exiting... (0)
Nov 22 02:46:48 np0005531888 systemd[1]: libpod-b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631.scope: Deactivated successfully.
Nov 22 02:46:48 np0005531888 podman[218035]: 2025-11-22 07:46:48.16721847 +0000 UTC m=+0.088848164 container died b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.167 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.169 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797608.1685066, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.169 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.187 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:48 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631-userdata-shm.mount: Deactivated successfully.
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.206 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:48 np0005531888 systemd[1]: var-lib-containers-storage-overlay-aedf59e408f4cc2856d6b8bd51c9a8e09a8ae81e057fc062ffde8cf83a60afe9-merged.mount: Deactivated successfully.
Nov 22 02:46:48 np0005531888 podman[218035]: 2025-11-22 07:46:48.224139104 +0000 UTC m=+0.145768788 container cleanup b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.229 186792 INFO nova.virt.libvirt.driver [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Instance destroyed successfully.#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.229 186792 DEBUG nova.objects.instance [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'resources' on Instance uuid eabc0ddf-fdc1-473a-b224-0dc0954d754c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:48 np0005531888 systemd[1]: libpod-conmon-b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631.scope: Deactivated successfully.
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.236 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.247 186792 DEBUG nova.virt.libvirt.vif [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-2063451211',display_name='tempest-ImagesOneServerNegativeTestJSON-server-2063451211',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-2063451211',id=30,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-8u6fkyns',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:46Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=eabc0ddf-fdc1-473a-b224-0dc0954d754c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.248 186792 DEBUG nova.network.os_vif_util [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "0bb26125-f048-4751-9c50-c8b2ff511438", "address": "fa:16:3e:de:f8:7c", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb26125-f0", "ovs_interfaceid": "0bb26125-f048-4751-9c50-c8b2ff511438", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.248 186792 DEBUG nova.network.os_vif_util [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.249 186792 DEBUG os_vif [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.250 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.251 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0bb26125-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.252 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.254 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.256 186792 INFO os_vif [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:f8:7c,bridge_name='br-int',has_traffic_filtering=True,id=0bb26125-f048-4751-9c50-c8b2ff511438,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bb26125-f0')#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.257 186792 INFO nova.virt.libvirt.driver [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Deleting instance files /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c_del#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.258 186792 INFO nova.virt.libvirt.driver [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Deletion of /var/lib/nova/instances/eabc0ddf-fdc1-473a-b224-0dc0954d754c_del complete#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.263 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.263 186792 DEBUG nova.virt.libvirt.migration [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:46:48 np0005531888 podman[218079]: 2025-11-22 07:46:48.294830187 +0000 UTC m=+0.045698010 container remove b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.300 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ba02e4-0881-4146-8bf8-6c08441e02b8]: (4, ('Sat Nov 22 07:46:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 (b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631)\nb9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631\nSat Nov 22 07:46:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 (b9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631)\nb9364cd67656ec98a478d674c47a5f096d18d18e5f6c9ecf866bca47c00ce631\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.303 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[392a4d1a-2c83-4caf-97e2-fd9c0512732c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.304 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa99606-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:48 np0005531888 kernel: tap4aa99606-70: left promiscuous mode
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.318 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.321 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a212f7b4-65a3-45ae-aa95-6b754609fce4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.336 186792 DEBUG nova.compute.manager [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.336 186792 DEBUG oslo_concurrency.lockutils [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.336 186792 DEBUG oslo_concurrency.lockutils [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.336 186792 DEBUG oslo_concurrency.lockutils [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.336 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[119687e9-a1c1-486d-9d1c-66c724d09fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.336 186792 DEBUG nova.compute.manager [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.337 186792 WARNING nova.compute.manager [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.337 186792 DEBUG nova.compute.manager [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-changed-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.337 186792 DEBUG nova.compute.manager [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Refreshing instance network info cache due to event network-changed-973cdfc2-4ad8-4f41-b383-4b64b1b5433f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.337 186792 DEBUG oslo_concurrency.lockutils [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.337 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dd17d418-4d57-4814-95eb-34e38784b5b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.337 186792 DEBUG oslo_concurrency.lockutils [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.338 186792 DEBUG nova.network.neutron [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Refreshing network info cache for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.355 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cb29fc07-fed1-44ea-959c-c92b7d765753]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433626, 'reachable_time': 22749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218094, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 kernel: tap973cdfc2-4a (unregistering): left promiscuous mode
Nov 22 02:46:48 np0005531888 systemd[1]: run-netns-ovnmeta\x2d4aa99606\x2d7691\x2d4fcb\x2d846d\x2d56459aaaa088.mount: Deactivated successfully.
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.362 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.362 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[616f70fd-0447-4c6e-bc9c-bdc065e8bc7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 NetworkManager[55166]: <info>  [1763797608.3644] device (tap973cdfc2-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:46:48 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:48Z|00108|binding|INFO|Releasing lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f from this chassis (sb_readonly=0)
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.372 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:48Z|00109|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f down in Southbound
Nov 22 02:46:48 np0005531888 ovn_controller[95067]: 2025-11-22T07:46:48Z|00110|binding|INFO|Removing iface tap973cdfc2-4a ovn-installed in OVS
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.378 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.384 186792 INFO nova.compute.manager [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.385 186792 DEBUG oslo.service.loopingcall [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.386 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:9b:98 10.100.0.14'], port_security=['fa:16:3e:02:9b:98 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'df09844c-c111-44b4-9c36-d4950a55a590'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '18', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=973cdfc2-4ad8-4f41-b383-4b64b1b5433f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.385 186792 DEBUG nova.compute.manager [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.385 186792 DEBUG nova.network.neutron [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.387 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 unbound from our chassis#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.389 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.390 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ba62b1-ca82-4cef-945c-a5c167b0d0ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.390 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace which is not needed anymore#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.398 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Nov 22 02:46:48 np0005531888 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Consumed 2.689s CPU time.
Nov 22 02:46:48 np0005531888 systemd-machined[153106]: Machine qemu-14-instance-0000001c terminated.
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [NOTICE]   (217716) : haproxy version is 2.8.14-c23fe91
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [NOTICE]   (217716) : path to executable is /usr/sbin/haproxy
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [WARNING]  (217716) : Exiting Master process...
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [ALERT]    (217716) : Current worker (217718) exited with code 143 (Terminated)
Nov 22 02:46:48 np0005531888 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217712]: [WARNING]  (217716) : All workers exited. Exiting... (0)
Nov 22 02:46:48 np0005531888 systemd[1]: libpod-b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f.scope: Deactivated successfully.
Nov 22 02:46:48 np0005531888 podman[218116]: 2025-11-22 07:46:48.527400837 +0000 UTC m=+0.045814963 container died b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:46:48 np0005531888 NetworkManager[55166]: <info>  [1763797608.5557] manager: (tap973cdfc2-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 22 02:46:48 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f-userdata-shm.mount: Deactivated successfully.
Nov 22 02:46:48 np0005531888 systemd[1]: var-lib-containers-storage-overlay-c6bc217b20e5d14d9af3d247ee842f76c6b550d0efc8d26dedae306493d5fd7c-merged.mount: Deactivated successfully.
Nov 22 02:46:48 np0005531888 podman[218116]: 2025-11-22 07:46:48.579773365 +0000 UTC m=+0.098187491 container cleanup b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:46:48 np0005531888 systemd[1]: libpod-conmon-b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f.scope: Deactivated successfully.
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.602 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.603 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.603 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 22 02:46:48 np0005531888 podman[218160]: 2025-11-22 07:46:48.640949047 +0000 UTC m=+0.036524747 container remove b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.649 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4bda208-9035-479d-9058-38a1368f0da9]: (4, ('Sat Nov 22 07:46:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f)\nb997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f\nSat Nov 22 07:46:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (b997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f)\nb997b33c2d030182e186cccc01dcec27eefcf740fbd777538aa08e9fa5655f6f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.650 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[10af4111-70c4-4236-afa2-277f38ab74c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.651 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.653 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 kernel: tapc3f966e1-80: left promiscuous mode
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.669 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.673 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0178be84-fd19-468b-8ee6-11be1542e7c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.686 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ff59dff3-7414-4024-a1da-6297466bd828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.687 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e089af-14a2-4e72-a8c7-bb52f1faba44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.708 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[670fa85e-c0f1-4efe-a24e-0cc5a79cab6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433059, 'reachable_time': 30312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218178, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.710 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:46:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:46:48.711 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e1307352-ce87-40a5-ab20-9044c6c146fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.765 186792 DEBUG nova.virt.libvirt.guest [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44' (instance-0000001c) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.766 186792 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration operation has completed#033[00m
Nov 22 02:46:48 np0005531888 nova_compute[186788]: 2025-11-22 07:46:48.766 186792 INFO nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] _post_live_migration() is started..#033[00m
Nov 22 02:46:49 np0005531888 systemd[1]: run-netns-ovnmeta\x2dc3f966e1\x2d8cff\x2d4ca0\x2d9b4f\x2da318c31b0a70.mount: Deactivated successfully.
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.614 186792 DEBUG nova.network.neutron [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.644 186792 INFO nova.compute.manager [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Took 1.26 seconds to deallocate network for instance.#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.727 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.728 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.808 186792 DEBUG nova.compute.provider_tree [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.826 186792 DEBUG nova.scheduler.client.report [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.849 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:49 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.914 186792 INFO nova.scheduler.client.report [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Deleted allocations for instance eabc0ddf-fdc1-473a-b224-0dc0954d754c#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:49.999 186792 DEBUG oslo_concurrency.lockutils [None req-b2885dcf-10f8-4d3a-9a61-63ce0490ad36 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.510 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received event network-vif-unplugged-0bb26125-f048-4751-9c50-c8b2ff511438 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.510 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.511 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.511 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.511 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] No waiting events found dispatching network-vif-unplugged-0bb26125-f048-4751-9c50-c8b2ff511438 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.511 186792 WARNING nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received unexpected event network-vif-unplugged-0bb26125-f048-4751-9c50-c8b2ff511438 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.511 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.512 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.512 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.512 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eabc0ddf-fdc1-473a-b224-0dc0954d754c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.512 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] No waiting events found dispatching network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.512 186792 WARNING nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received unexpected event network-vif-plugged-0bb26125-f048-4751-9c50-c8b2ff511438 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.513 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.513 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.513 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.513 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.514 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.514 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.514 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.515 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.515 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.515 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.515 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.515 186792 WARNING nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.516 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.516 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.516 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.516 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.516 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.517 186792 WARNING nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.517 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Received event network-vif-deleted-0bb26125-f048-4751-9c50-c8b2ff511438 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.517 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.517 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.517 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.518 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.518 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.518 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.518 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.518 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.519 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.519 186792 DEBUG oslo_concurrency.lockutils [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.519 186792 DEBUG nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.519 186792 WARNING nova.compute.manager [req-6157b081-fd51-4e1a-a974-d33c11ab56b4 req-6ec72c2f-cf41-46c5-b217-657c6183be05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.549 186792 DEBUG nova.network.neutron [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Activated binding for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.550 186792 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.551 186792 DEBUG nova.virt.libvirt.vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:39Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.551 186792 DEBUG nova.network.os_vif_util [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.552 186792 DEBUG nova.network.os_vif_util [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.552 186792 DEBUG os_vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.554 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.554 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap973cdfc2-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.556 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.557 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.674 186792 DEBUG nova.network.neutron [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updated VIF entry in instance network info cache for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.675 186792 DEBUG nova.network.neutron [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.758 186792 DEBUG oslo_concurrency.lockutils [req-0907750a-ef94-4bda-8d82-6e68f07ecbf5 req-c5d82698-c349-4ea0-ab47-783e4606225b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.785 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.790 186792 INFO os_vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a')#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.790 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.791 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.791 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.792 186792 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.792 186792 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deleting instance files /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44_del#033[00m
Nov 22 02:46:50 np0005531888 nova_compute[186788]: 2025-11-22 07:46:50.793 186792 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deletion of /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44_del complete#033[00m
Nov 22 02:46:51 np0005531888 nova_compute[186788]: 2025-11-22 07:46:51.324 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:52 np0005531888 nova_compute[186788]: 2025-11-22 07:46:52.603 186792 DEBUG nova.compute.manager [req-4932f66c-be11-49df-b1b8-b5118c230b01 req-c68ba625-d6a8-4f74-bf7a-c7c5758795fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:52 np0005531888 nova_compute[186788]: 2025-11-22 07:46:52.603 186792 DEBUG oslo_concurrency.lockutils [req-4932f66c-be11-49df-b1b8-b5118c230b01 req-c68ba625-d6a8-4f74-bf7a-c7c5758795fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:52 np0005531888 nova_compute[186788]: 2025-11-22 07:46:52.603 186792 DEBUG oslo_concurrency.lockutils [req-4932f66c-be11-49df-b1b8-b5118c230b01 req-c68ba625-d6a8-4f74-bf7a-c7c5758795fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:52 np0005531888 nova_compute[186788]: 2025-11-22 07:46:52.603 186792 DEBUG oslo_concurrency.lockutils [req-4932f66c-be11-49df-b1b8-b5118c230b01 req-c68ba625-d6a8-4f74-bf7a-c7c5758795fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:52 np0005531888 nova_compute[186788]: 2025-11-22 07:46:52.603 186792 DEBUG nova.compute.manager [req-4932f66c-be11-49df-b1b8-b5118c230b01 req-c68ba625-d6a8-4f74-bf7a-c7c5758795fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:52 np0005531888 nova_compute[186788]: 2025-11-22 07:46:52.604 186792 WARNING nova.compute.manager [req-4932f66c-be11-49df-b1b8-b5118c230b01 req-c68ba625-d6a8-4f74-bf7a-c7c5758795fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:46:54 np0005531888 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:46:54 np0005531888 systemd[217963]: Activating special unit Exit the Session...
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped target Main User Target.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped target Basic System.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped target Paths.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped target Sockets.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped target Timers.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:46:54 np0005531888 systemd[217963]: Closed D-Bus User Message Bus Socket.
Nov 22 02:46:54 np0005531888 systemd[217963]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:46:54 np0005531888 systemd[217963]: Removed slice User Application Slice.
Nov 22 02:46:54 np0005531888 systemd[217963]: Reached target Shutdown.
Nov 22 02:46:54 np0005531888 systemd[217963]: Finished Exit the Session.
Nov 22 02:46:54 np0005531888 systemd[217963]: Reached target Exit the Session.
Nov 22 02:46:54 np0005531888 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:46:54 np0005531888 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:46:54 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:46:54 np0005531888 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:46:54 np0005531888 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:46:54 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:46:54 np0005531888 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:46:55 np0005531888 nova_compute[186788]: 2025-11-22 07:46:55.558 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531888 podman[218181]: 2025-11-22 07:46:55.722316464 +0000 UTC m=+0.092624740 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:46:55 np0005531888 podman[218182]: 2025-11-22 07:46:55.733269082 +0000 UTC m=+0.102939542 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:46:56 np0005531888 nova_compute[186788]: 2025-11-22 07:46:56.327 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.024 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.025 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.026 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.051 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.052 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.052 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.052 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.287 186792 WARNING nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.291 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5729MB free_disk=73.45680618286133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.291 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.291 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.340 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Migration for instance 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.387 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.423 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Migration 31f45e55-ca5e-47f9-8b1e-254643907ee9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.424 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.424 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.479 186792 DEBUG nova.compute.provider_tree [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.508 186792 DEBUG nova.scheduler.client.report [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.551 186792 DEBUG nova.compute.resource_tracker [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.551 186792 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.568 186792 INFO nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.767 186792 INFO nova.scheduler.client.report [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Deleted allocation for migration 31f45e55-ca5e-47f9-8b1e-254643907ee9
Nov 22 02:46:57 np0005531888 nova_compute[186788]: 2025-11-22 07:46:57.768 186792 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 22 02:46:59 np0005531888 podman[218224]: 2025-11-22 07:46:59.711712601 +0000 UTC m=+0.069001002 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:47:00 np0005531888 nova_compute[186788]: 2025-11-22 07:47:00.568 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.217 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "a67311b5-a972-4987-b292-17e97d9111ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.218 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "a67311b5-a972-4987-b292-17e97d9111ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.236 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.330 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.347 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.348 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.355 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.356 186792 INFO nova.compute.claims [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Claim successful on node compute-2.ctlplane.example.com
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.495 186792 DEBUG nova.compute.provider_tree [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.510 186792 DEBUG nova.scheduler.client.report [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.540 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.541 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.603 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.604 186792 DEBUG nova.network.neutron [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.632 186792 INFO nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.666 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.784 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.786 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.786 186792 INFO nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Creating image(s)
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.787 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "/var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.787 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.788 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.801 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.862 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.863 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.864 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.880 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.941 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.942 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.994 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.995 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:01 np0005531888 nova_compute[186788]: 2025-11-22 07:47:01.995 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.050 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.051 186792 DEBUG nova.virt.disk.api [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Checking if we can resize image /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.052 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.111 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.112 186792 DEBUG nova.virt.disk.api [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Cannot resize image /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.112 186792 DEBUG nova.objects.instance [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'migration_context' on Instance uuid a67311b5-a972-4987-b292-17e97d9111ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.128 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.128 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Ensure instance console log exists: /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.129 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.129 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.129 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.477 186792 DEBUG nova.network.neutron [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.477 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.479 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.484 186792 WARNING nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.499 186792 DEBUG nova.virt.libvirt.host [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.501 186792 DEBUG nova.virt.libvirt.host [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.506 186792 DEBUG nova.virt.libvirt.host [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.507 186792 DEBUG nova.virt.libvirt.host [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.508 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.508 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.509 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.509 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.509 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.509 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.510 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.510 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.510 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.510 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.511 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.511 186792 DEBUG nova.virt.hardware [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.515 186792 DEBUG nova.objects.instance [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'pci_devices' on Instance uuid a67311b5-a972-4987-b292-17e97d9111ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.537 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <uuid>a67311b5-a972-4987-b292-17e97d9111ab</uuid>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <name>instance-00000020</name>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1374622036</nova:name>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:47:02</nova:creationTime>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:user uuid="f3272c6a12f44ac18db2715976e29248">tempest-ServersOnMultiNodesTest-214232393-project-member</nova:user>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:        <nova:project uuid="b764107a4dca4a799bc3edefe458310b">tempest-ServersOnMultiNodesTest-214232393</nova:project>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <entry name="serial">a67311b5-a972-4987-b292-17e97d9111ab</entry>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <entry name="uuid">a67311b5-a972-4987-b292-17e97d9111ab</entry>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.config"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/console.log" append="off"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:47:02 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:47:02 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:47:02 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:47:02 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.585 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.586 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.586 186792 INFO nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Using config drive#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.851 186792 INFO nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Creating config drive at /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.config#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.857 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphfi0c1bt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:02 np0005531888 nova_compute[186788]: 2025-11-22 07:47:02.986 186792 DEBUG oslo_concurrency.processutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphfi0c1bt" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:03 np0005531888 systemd-machined[153106]: New machine qemu-16-instance-00000020.
Nov 22 02:47:03 np0005531888 systemd[1]: Started Virtual Machine qemu-16-instance-00000020.
Nov 22 02:47:03 np0005531888 podman[218273]: 2025-11-22 07:47:03.142681002 +0000 UTC m=+0.072289055 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.238 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797608.2262464, eabc0ddf-fdc1-473a-b224-0dc0954d754c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.239 186792 INFO nova.compute.manager [-] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.261 186792 DEBUG nova.compute.manager [None req-7b7609a4-ecb5-4271-9981-0dcd88b50971 - - - - - -] [instance: eabc0ddf-fdc1-473a-b224-0dc0954d754c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.601 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797608.600873, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.602 186792 INFO nova.compute.manager [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.636 186792 DEBUG nova.compute.manager [None req-f0557bd1-93e4-44fd-a2e6-c5495d37055a - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.719 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797623.7188866, a67311b5-a972-4987-b292-17e97d9111ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.719 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.722 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.722 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.726 186792 INFO nova.virt.libvirt.driver [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance spawned successfully.#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.727 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.747 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.752 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.753 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.753 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.754 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.754 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.755 186792 DEBUG nova.virt.libvirt.driver [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.759 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.788 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.789 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797623.7207632, a67311b5-a972-4987-b292-17e97d9111ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.789 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] VM Started (Lifecycle Event)#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.826 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.830 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.868 186792 INFO nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Took 2.08 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.869 186792 DEBUG nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.870 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:03 np0005531888 nova_compute[186788]: 2025-11-22 07:47:03.975 186792 INFO nova.compute.manager [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Took 2.65 seconds to build instance.#033[00m
Nov 22 02:47:04 np0005531888 nova_compute[186788]: 2025-11-22 07:47:04.006 186792 DEBUG oslo_concurrency.lockutils [None req-7e4b2aa1-f9c2-4eb1-b726-87f20008dcba f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "a67311b5-a972-4987-b292-17e97d9111ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:05 np0005531888 nova_compute[186788]: 2025-11-22 07:47:05.573 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:06 np0005531888 nova_compute[186788]: 2025-11-22 07:47:06.330 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:09 np0005531888 podman[218310]: 2025-11-22 07:47:09.701633299 +0000 UTC m=+0.066232572 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:47:10 np0005531888 nova_compute[186788]: 2025-11-22 07:47:10.579 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:11 np0005531888 nova_compute[186788]: 2025-11-22 07:47:11.334 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:12 np0005531888 podman[218331]: 2025-11-22 07:47:12.697492782 +0000 UTC m=+0.070030917 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:47:14 np0005531888 podman[218356]: 2025-11-22 07:47:14.699753613 +0000 UTC m=+0.065037131 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Nov 22 02:47:15 np0005531888 nova_compute[186788]: 2025-11-22 07:47:15.583 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:16 np0005531888 nova_compute[186788]: 2025-11-22 07:47:16.336 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:20 np0005531888 nova_compute[186788]: 2025-11-22 07:47:20.587 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:21 np0005531888 nova_compute[186788]: 2025-11-22 07:47:21.338 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531888 nova_compute[186788]: 2025-11-22 07:47:23.947 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "87c6e734-1791-4ca2-ab41-e01032ce4934" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:23 np0005531888 nova_compute[186788]: 2025-11-22 07:47:23.947 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "87c6e734-1791-4ca2-ab41-e01032ce4934" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.006 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.124 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.124 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.132 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.132 186792 INFO nova.compute.claims [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.267 186792 DEBUG nova.compute.provider_tree [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.282 186792 DEBUG nova.scheduler.client.report [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.316 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.330 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "c837965b-440e-43df-b19e-691163d254d5" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.330 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "c837965b-440e-43df-b19e-691163d254d5" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.348 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] No node specified, defaulting to compute-2.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.425 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "c837965b-440e-43df-b19e-691163d254d5" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.426 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.483 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.483 186792 DEBUG nova.network.neutron [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.513 186792 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.543 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.744 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.745 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.746 186792 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Creating image(s)#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.747 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "/var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.747 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.748 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.762 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.838 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.840 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.840 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.853 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.885 186792 DEBUG nova.network.neutron [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.885 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.915 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.916 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.951 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.952 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:24 np0005531888 nova_compute[186788]: 2025-11-22 07:47:24.952 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.011 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.012 186792 DEBUG nova.virt.disk.api [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Checking if we can resize image /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.013 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.076 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.078 186792 DEBUG nova.virt.disk.api [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Cannot resize image /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.078 186792 DEBUG nova.objects.instance [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'migration_context' on Instance uuid 87c6e734-1791-4ca2-ab41-e01032ce4934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.254 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.255 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Ensure instance console log exists: /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.255 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.256 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.256 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.258 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.263 186792 WARNING nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.268 186792 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.269 186792 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.279 186792 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.280 186792 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.281 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.282 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.282 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.282 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.283 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.283 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.283 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.284 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.284 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.284 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.284 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.284 186792 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.289 186792 DEBUG nova.objects.instance [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'pci_devices' on Instance uuid 87c6e734-1791-4ca2-ab41-e01032ce4934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.324 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <uuid>87c6e734-1791-4ca2-ab41-e01032ce4934</uuid>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <name>instance-00000026</name>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1882522842-2</nova:name>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:47:25</nova:creationTime>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:user uuid="f3272c6a12f44ac18db2715976e29248">tempest-ServersOnMultiNodesTest-214232393-project-member</nova:user>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:        <nova:project uuid="b764107a4dca4a799bc3edefe458310b">tempest-ServersOnMultiNodesTest-214232393</nova:project>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <entry name="serial">87c6e734-1791-4ca2-ab41-e01032ce4934</entry>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <entry name="uuid">87c6e734-1791-4ca2-ab41-e01032ce4934</entry>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.config"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/console.log" append="off"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:47:25 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:47:25 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:47:25 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:47:25 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.425 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.426 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.426 186792 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Using config drive#033[00m
Nov 22 02:47:25 np0005531888 nova_compute[186788]: 2025-11-22 07:47:25.591 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:26 np0005531888 nova_compute[186788]: 2025-11-22 07:47:26.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:26 np0005531888 nova_compute[186788]: 2025-11-22 07:47:26.411 186792 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Creating config drive at /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.config#033[00m
Nov 22 02:47:26 np0005531888 nova_compute[186788]: 2025-11-22 07:47:26.419 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi8_er7hh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:26 np0005531888 nova_compute[186788]: 2025-11-22 07:47:26.550 186792 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi8_er7hh" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:26 np0005531888 systemd-machined[153106]: New machine qemu-17-instance-00000026.
Nov 22 02:47:26 np0005531888 systemd[1]: Started Virtual Machine qemu-17-instance-00000026.
Nov 22 02:47:26 np0005531888 podman[218418]: 2025-11-22 07:47:26.704800224 +0000 UTC m=+0.074821529 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:47:26 np0005531888 podman[218419]: 2025-11-22 07:47:26.786745673 +0000 UTC m=+0.152625153 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.319 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797647.3175433, 87c6e734-1791-4ca2-ab41-e01032ce4934 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.321 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.324 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.325 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.331 186792 INFO nova.virt.libvirt.driver [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Instance spawned successfully.#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.331 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.350 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.356 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.368 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.369 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.369 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.369 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.370 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.370 186792 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.397 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.397 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797647.3202548, 87c6e734-1791-4ca2-ab41-e01032ce4934 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.397 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] VM Started (Lifecycle Event)#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.440 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.444 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.471 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.473 186792 INFO nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Took 2.73 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.474 186792 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.587 186792 INFO nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Took 3.51 seconds to build instance.#033[00m
Nov 22 02:47:27 np0005531888 nova_compute[186788]: 2025-11-22 07:47:27.618 186792 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "87c6e734-1791-4ca2-ab41-e01032ce4934" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:30 np0005531888 nova_compute[186788]: 2025-11-22 07:47:30.594 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:30 np0005531888 podman[218480]: 2025-11-22 07:47:30.691516052 +0000 UTC m=+0.065726678 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:47:31 np0005531888 nova_compute[186788]: 2025-11-22 07:47:31.380 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.205 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "87c6e734-1791-4ca2-ab41-e01032ce4934" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.206 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "87c6e734-1791-4ca2-ab41-e01032ce4934" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.206 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "87c6e734-1791-4ca2-ab41-e01032ce4934-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.206 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "87c6e734-1791-4ca2-ab41-e01032ce4934-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.207 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "87c6e734-1791-4ca2-ab41-e01032ce4934-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.217 186792 INFO nova.compute.manager [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Terminating instance#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.224 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "refresh_cache-87c6e734-1791-4ca2-ab41-e01032ce4934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.224 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquired lock "refresh_cache-87c6e734-1791-4ca2-ab41-e01032ce4934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.224 186792 DEBUG nova.network.neutron [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.465 186792 DEBUG nova.network.neutron [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:47:33 np0005531888 podman[218502]: 2025-11-22 07:47:33.694282241 +0000 UTC m=+0.060382093 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.732 186792 DEBUG nova.network.neutron [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.752 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Releasing lock "refresh_cache-87c6e734-1791-4ca2-ab41-e01032ce4934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:47:33 np0005531888 nova_compute[186788]: 2025-11-22 07:47:33.752 186792 DEBUG nova.compute.manager [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:47:33 np0005531888 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000026.scope: Deactivated successfully.
Nov 22 02:47:33 np0005531888 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000026.scope: Consumed 7.109s CPU time.
Nov 22 02:47:33 np0005531888 systemd-machined[153106]: Machine qemu-17-instance-00000026 terminated.
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.008 186792 INFO nova.virt.libvirt.driver [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Instance destroyed successfully.#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.008 186792 DEBUG nova.objects.instance [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'resources' on Instance uuid 87c6e734-1791-4ca2-ab41-e01032ce4934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.019 186792 INFO nova.virt.libvirt.driver [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Deleting instance files /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934_del#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.020 186792 INFO nova.virt.libvirt.driver [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Deletion of /var/lib/nova/instances/87c6e734-1791-4ca2-ab41-e01032ce4934_del complete#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.336 186792 INFO nova.compute.manager [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.337 186792 DEBUG oslo.service.loopingcall [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.337 186792 DEBUG nova.compute.manager [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.338 186792 DEBUG nova.network.neutron [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:47:34 np0005531888 nova_compute[186788]: 2025-11-22 07:47:34.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.132 186792 DEBUG nova.network.neutron [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.144 186792 DEBUG nova.network.neutron [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.160 186792 INFO nova.compute.manager [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Took 0.82 seconds to deallocate network for instance.#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.163 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-a67311b5-a972-4987-b292-17e97d9111ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.163 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-a67311b5-a972-4987-b292-17e97d9111ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.163 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.163 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a67311b5-a972-4987-b292-17e97d9111ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.287 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.288 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.597 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.669 186792 DEBUG nova.compute.provider_tree [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.682 186792 DEBUG nova.scheduler.client.report [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.852 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.973 186792 INFO nova.scheduler.client.report [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Deleted allocations for instance 87c6e734-1791-4ca2-ab41-e01032ce4934#033[00m
Nov 22 02:47:35 np0005531888 nova_compute[186788]: 2025-11-22 07:47:35.989 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:47:36 np0005531888 nova_compute[186788]: 2025-11-22 07:47:36.107 186792 DEBUG oslo_concurrency.lockutils [None req-129cde25-bf4e-4530-9453-86a2921e9d66 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "87c6e734-1791-4ca2-ab41-e01032ce4934" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:36 np0005531888 nova_compute[186788]: 2025-11-22 07:47:36.382 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:36 np0005531888 nova_compute[186788]: 2025-11-22 07:47:36.473 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:36 np0005531888 nova_compute[186788]: 2025-11-22 07:47:36.495 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-a67311b5-a972-4987-b292-17e97d9111ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:47:36 np0005531888 nova_compute[186788]: 2025-11-22 07:47:36.496 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:47:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:47:36.799 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:47:36.799 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:47:36.800 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:38 np0005531888 nova_compute[186788]: 2025-11-22 07:47:38.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:38 np0005531888 nova_compute[186788]: 2025-11-22 07:47:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:38 np0005531888 nova_compute[186788]: 2025-11-22 07:47:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:40 np0005531888 nova_compute[186788]: 2025-11-22 07:47:40.600 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:40 np0005531888 podman[218531]: 2025-11-22 07:47:40.721845036 +0000 UTC m=+0.089093731 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:47:40 np0005531888 nova_compute[186788]: 2025-11-22 07:47:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.205 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "a67311b5-a972-4987-b292-17e97d9111ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.206 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "a67311b5-a972-4987-b292-17e97d9111ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.206 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "a67311b5-a972-4987-b292-17e97d9111ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.206 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "a67311b5-a972-4987-b292-17e97d9111ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.206 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "a67311b5-a972-4987-b292-17e97d9111ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.215 186792 INFO nova.compute.manager [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Terminating instance#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.223 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "refresh_cache-a67311b5-a972-4987-b292-17e97d9111ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.224 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquired lock "refresh_cache-a67311b5-a972-4987-b292-17e97d9111ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.224 186792 DEBUG nova.network.neutron [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.384 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.433 186792 DEBUG nova.network.neutron [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.784 186792 DEBUG nova.network.neutron [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.798 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Releasing lock "refresh_cache-a67311b5-a972-4987-b292-17e97d9111ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.799 186792 DEBUG nova.compute.manager [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:47:41 np0005531888 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000020.scope: Deactivated successfully.
Nov 22 02:47:41 np0005531888 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000020.scope: Consumed 14.681s CPU time.
Nov 22 02:47:41 np0005531888 systemd-machined[153106]: Machine qemu-16-instance-00000020 terminated.
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:41 np0005531888 nova_compute[186788]: 2025-11-22 07:47:41.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.053 186792 INFO nova.virt.libvirt.driver [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance destroyed successfully.#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.054 186792 DEBUG nova.objects.instance [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'resources' on Instance uuid a67311b5-a972-4987-b292-17e97d9111ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.075 186792 INFO nova.virt.libvirt.driver [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Deleting instance files /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab_del#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.076 186792 INFO nova.virt.libvirt.driver [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Deletion of /var/lib/nova/instances/a67311b5-a972-4987-b292-17e97d9111ab_del complete#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.152 186792 INFO nova.compute.manager [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.153 186792 DEBUG oslo.service.loopingcall [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.153 186792 DEBUG nova.compute.manager [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.154 186792 DEBUG nova.network.neutron [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.427 186792 DEBUG nova.network.neutron [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.442 186792 DEBUG nova.network.neutron [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.464 186792 INFO nova.compute.manager [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Took 0.31 seconds to deallocate network for instance.#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.541 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.542 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.605 186792 DEBUG nova.compute.provider_tree [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.618 186792 DEBUG nova.scheduler.client.report [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.639 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.685 186792 INFO nova.scheduler.client.report [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Deleted allocations for instance a67311b5-a972-4987-b292-17e97d9111ab#033[00m
Nov 22 02:47:42 np0005531888 nova_compute[186788]: 2025-11-22 07:47:42.769 186792 DEBUG oslo_concurrency.lockutils [None req-e86b4771-4d7b-4d26-a123-938a706a23c2 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "a67311b5-a972-4987-b292-17e97d9111ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:47:43.659 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:47:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:47:43.660 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:47:43 np0005531888 nova_compute[186788]: 2025-11-22 07:47:43.664 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:43 np0005531888 podman[218561]: 2025-11-22 07:47:43.696432349 +0000 UTC m=+0.064096817 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:47:43 np0005531888 nova_compute[186788]: 2025-11-22 07:47:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:43 np0005531888 nova_compute[186788]: 2025-11-22 07:47:43.972 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:43 np0005531888 nova_compute[186788]: 2025-11-22 07:47:43.972 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:43 np0005531888 nova_compute[186788]: 2025-11-22 07:47:43.972 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:43 np0005531888 nova_compute[186788]: 2025-11-22 07:47:43.973 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.152 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.153 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5737MB free_disk=73.45679092407227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.153 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.153 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.221 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.222 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.249 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.269 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.307 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:47:44 np0005531888 nova_compute[186788]: 2025-11-22 07:47:44.307 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:45 np0005531888 nova_compute[186788]: 2025-11-22 07:47:45.301 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:45 np0005531888 nova_compute[186788]: 2025-11-22 07:47:45.603 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531888 podman[218586]: 2025-11-22 07:47:45.698210117 +0000 UTC m=+0.065360669 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350)
Nov 22 02:47:45 np0005531888 nova_compute[186788]: 2025-11-22 07:47:45.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:46 np0005531888 nova_compute[186788]: 2025-11-22 07:47:46.387 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:49 np0005531888 nova_compute[186788]: 2025-11-22 07:47:49.007 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797654.005922, 87c6e734-1791-4ca2-ab41-e01032ce4934 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:49 np0005531888 nova_compute[186788]: 2025-11-22 07:47:49.008 186792 INFO nova.compute.manager [-] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:49 np0005531888 nova_compute[186788]: 2025-11-22 07:47:49.024 186792 DEBUG nova.compute.manager [None req-850d2fa1-8ff8-446a-b264-ba1b6d5a2d4a - - - - - -] [instance: 87c6e734-1791-4ca2-ab41-e01032ce4934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:47:49.662 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:50 np0005531888 nova_compute[186788]: 2025-11-22 07:47:50.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:50 np0005531888 nova_compute[186788]: 2025-11-22 07:47:50.604 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:51 np0005531888 nova_compute[186788]: 2025-11-22 07:47:51.387 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:53 np0005531888 nova_compute[186788]: 2025-11-22 07:47:53.938 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:53 np0005531888 nova_compute[186788]: 2025-11-22 07:47:53.939 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:53 np0005531888 nova_compute[186788]: 2025-11-22 07:47:53.966 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.101 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.101 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.108 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.109 186792 INFO nova.compute.claims [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.271 186792 DEBUG nova.compute.provider_tree [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.290 186792 DEBUG nova.scheduler.client.report [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.313 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.314 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.374 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.374 186792 DEBUG nova.network.neutron [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.405 186792 INFO nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.437 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.632 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.634 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.634 186792 INFO nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Creating image(s)#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.635 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "/var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.635 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "/var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.636 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "/var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.650 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.713 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.714 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.715 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.728 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.785 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.787 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.835 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.836 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.837 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.902 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.903 186792 DEBUG nova.virt.disk.api [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Checking if we can resize image /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.904 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.974 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.976 186792 DEBUG nova.virt.disk.api [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Cannot resize image /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:47:54 np0005531888 nova_compute[186788]: 2025-11-22 07:47:54.976 186792 DEBUG nova.objects.instance [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lazy-loading 'migration_context' on Instance uuid c3009d0a-92f7-42b9-a930-9bdb9e70bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.001 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.002 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Ensure instance console log exists: /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.003 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.003 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.003 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.050 186792 DEBUG nova.network.neutron [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.051 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.053 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.058 186792 WARNING nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.063 186792 DEBUG nova.virt.libvirt.host [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.064 186792 DEBUG nova.virt.libvirt.host [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.068 186792 DEBUG nova.virt.libvirt.host [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.068 186792 DEBUG nova.virt.libvirt.host [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.070 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.071 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.071 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.071 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.072 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.072 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.072 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.073 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.073 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.073 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.073 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.073 186792 DEBUG nova.virt.hardware [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.078 186792 DEBUG nova.objects.instance [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lazy-loading 'pci_devices' on Instance uuid c3009d0a-92f7-42b9-a930-9bdb9e70bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.108 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <uuid>c3009d0a-92f7-42b9-a930-9bdb9e70bd08</uuid>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <name>instance-00000027</name>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:name>tempest-ListImageFiltersTestJSON-server-1105610867</nova:name>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:47:55</nova:creationTime>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:user uuid="f9a51b2699f1471d9e9b3463921a67fe">tempest-ListImageFiltersTestJSON-497193209-project-member</nova:user>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:        <nova:project uuid="072c26a765bb4c6081d04d313aceda15">tempest-ListImageFiltersTestJSON-497193209</nova:project>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:55 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <entry name="serial">c3009d0a-92f7-42b9-a930-9bdb9e70bd08</entry>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <entry name="uuid">c3009d0a-92f7-42b9-a930-9bdb9e70bd08</entry>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk"/>
Nov 22 02:47:55 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.config"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/console.log" append="off"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:47:55 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:47:55 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:47:55 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:47:55 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.150 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.150 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.151 186792 INFO nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Using config drive#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.608 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.626 186792 INFO nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Creating config drive at /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.config#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.631 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp_r9hd9w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:55 np0005531888 nova_compute[186788]: 2025-11-22 07:47:55.759 186792 DEBUG oslo_concurrency.processutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp_r9hd9w" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:55 np0005531888 systemd-machined[153106]: New machine qemu-18-instance-00000027.
Nov 22 02:47:55 np0005531888 systemd[1]: Started Virtual Machine qemu-18-instance-00000027.
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.794 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797676.7937222, c3009d0a-92f7-42b9-a930-9bdb9e70bd08 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.795 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.800 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.800 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.806 186792 INFO nova.virt.libvirt.driver [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance spawned successfully.#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.807 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:47:56 np0005531888 podman[218648]: 2025-11-22 07:47:56.846886752 +0000 UTC m=+0.102880769 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.849 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.858 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.861 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.862 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.862 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.863 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.863 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.863 186792 DEBUG nova.virt.libvirt.driver [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.933 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.933 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797676.797937, c3009d0a-92f7-42b9-a930-9bdb9e70bd08 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.934 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] VM Started (Lifecycle Event)#033[00m
Nov 22 02:47:56 np0005531888 podman[218669]: 2025-11-22 07:47:56.945695251 +0000 UTC m=+0.097414997 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.961 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:56 np0005531888 nova_compute[186788]: 2025-11-22 07:47:56.964 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.010 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.049 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797662.0485487, a67311b5-a972-4987-b292-17e97d9111ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.050 186792 INFO nova.compute.manager [-] [instance: a67311b5-a972-4987-b292-17e97d9111ab] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.080 186792 DEBUG nova.compute.manager [None req-37adcfbe-caf9-4c21-a825-569e8581ef56 - - - - - -] [instance: a67311b5-a972-4987-b292-17e97d9111ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.321 186792 INFO nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Took 2.69 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.322 186792 DEBUG nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.455 186792 INFO nova.compute.manager [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Took 3.41 seconds to build instance.#033[00m
Nov 22 02:47:57 np0005531888 nova_compute[186788]: 2025-11-22 07:47:57.559 186792 DEBUG oslo_concurrency.lockutils [None req-7a7a482a-8b4c-4353-bb68-e28022ea92b7 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:00 np0005531888 nova_compute[186788]: 2025-11-22 07:48:00.611 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:01 np0005531888 nova_compute[186788]: 2025-11-22 07:48:01.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:01 np0005531888 podman[218697]: 2025-11-22 07:48:01.682787978 +0000 UTC m=+0.053752470 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:48:04 np0005531888 podman[218722]: 2025-11-22 07:48:04.685808221 +0000 UTC m=+0.057527032 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.105 186792 DEBUG nova.compute.manager [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.286 186792 INFO nova.compute.manager [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] instance snapshotting#033[00m
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.534 186792 INFO nova.virt.libvirt.driver [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Beginning live snapshot process#033[00m
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.615 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:05 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.855 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.918 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.920 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:05 np0005531888 nova_compute[186788]: 2025-11-22 07:48:05.995 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.010 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.080 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.081 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpa0ym93nh/d323eec9efe64d56854875958b53c276.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.126 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpa0ym93nh/d323eec9efe64d56854875958b53c276.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.128 186792 INFO nova.virt.libvirt.driver [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.188 186792 DEBUG nova.virt.libvirt.guest [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.392 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.693 186792 DEBUG nova.virt.libvirt.guest [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.697 186792 INFO nova.virt.libvirt.driver [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.740 186792 DEBUG nova.privsep.utils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.741 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpa0ym93nh/d323eec9efe64d56854875958b53c276.delta /var/lib/nova/instances/snapshots/tmpa0ym93nh/d323eec9efe64d56854875958b53c276 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.906 186792 DEBUG oslo_concurrency.processutils [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpa0ym93nh/d323eec9efe64d56854875958b53c276.delta /var/lib/nova/instances/snapshots/tmpa0ym93nh/d323eec9efe64d56854875958b53c276" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:06 np0005531888 nova_compute[186788]: 2025-11-22 07:48:06.907 186792 INFO nova.virt.libvirt.driver [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:48:10 np0005531888 nova_compute[186788]: 2025-11-22 07:48:10.619 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:10 np0005531888 nova_compute[186788]: 2025-11-22 07:48:10.982 186792 INFO nova.virt.libvirt.driver [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Snapshot image upload complete#033[00m
Nov 22 02:48:10 np0005531888 nova_compute[186788]: 2025-11-22 07:48:10.982 186792 INFO nova.compute.manager [None req-72bb999e-0970-437d-a376-2aed6e423a39 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Took 5.69 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:48:11 np0005531888 nova_compute[186788]: 2025-11-22 07:48:11.392 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:11 np0005531888 podman[218777]: 2025-11-22 07:48:11.690537648 +0000 UTC m=+0.062490152 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:48:14 np0005531888 podman[218797]: 2025-11-22 07:48:14.680143776 +0000 UTC m=+0.056314393 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:48:15 np0005531888 nova_compute[186788]: 2025-11-22 07:48:15.623 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:16 np0005531888 nova_compute[186788]: 2025-11-22 07:48:16.394 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:16 np0005531888 podman[218821]: 2025-11-22 07:48:16.685500452 +0000 UTC m=+0.058100745 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 02:48:19 np0005531888 nova_compute[186788]: 2025-11-22 07:48:19.860 186792 DEBUG nova.compute.manager [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:19 np0005531888 nova_compute[186788]: 2025-11-22 07:48:19.934 186792 INFO nova.compute.manager [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] instance snapshotting#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.197 186792 INFO nova.virt.libvirt.driver [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Beginning live snapshot process#033[00m
Nov 22 02:48:20 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.387 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.451 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.452 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.516 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.531 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.595 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.597 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpm9y1itfy/6ccfa62804804a9b9e5c30d092077ddf.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.626 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.646 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpm9y1itfy/6ccfa62804804a9b9e5c30d092077ddf.delta 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.647 186792 INFO nova.virt.libvirt.driver [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:48:20 np0005531888 nova_compute[186788]: 2025-11-22 07:48:20.705 186792 DEBUG nova.virt.libvirt.guest [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:21 np0005531888 nova_compute[186788]: 2025-11-22 07:48:21.210 186792 DEBUG nova.virt.libvirt.guest [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 10420224 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:21 np0005531888 nova_compute[186788]: 2025-11-22 07:48:21.395 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:21 np0005531888 nova_compute[186788]: 2025-11-22 07:48:21.713 186792 DEBUG nova.virt.libvirt.guest [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:21 np0005531888 nova_compute[186788]: 2025-11-22 07:48:21.716 186792 INFO nova.virt.libvirt.driver [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:48:21 np0005531888 nova_compute[186788]: 2025-11-22 07:48:21.761 186792 DEBUG nova.privsep.utils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:48:21 np0005531888 nova_compute[186788]: 2025-11-22 07:48:21.761 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpm9y1itfy/6ccfa62804804a9b9e5c30d092077ddf.delta /var/lib/nova/instances/snapshots/tmpm9y1itfy/6ccfa62804804a9b9e5c30d092077ddf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:23 np0005531888 nova_compute[186788]: 2025-11-22 07:48:23.140 186792 DEBUG oslo_concurrency.processutils [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpm9y1itfy/6ccfa62804804a9b9e5c30d092077ddf.delta /var/lib/nova/instances/snapshots/tmpm9y1itfy/6ccfa62804804a9b9e5c30d092077ddf" returned: 0 in 1.379s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:23 np0005531888 nova_compute[186788]: 2025-11-22 07:48:23.147 186792 INFO nova.virt.libvirt.driver [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:48:25 np0005531888 nova_compute[186788]: 2025-11-22 07:48:25.630 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:26 np0005531888 nova_compute[186788]: 2025-11-22 07:48:26.397 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:26 np0005531888 nova_compute[186788]: 2025-11-22 07:48:26.755 186792 INFO nova.virt.libvirt.driver [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Snapshot image upload complete#033[00m
Nov 22 02:48:26 np0005531888 nova_compute[186788]: 2025-11-22 07:48:26.756 186792 INFO nova.compute.manager [None req-ccd39b10-7fdc-427a-ba88-c2d1befe7c15 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Took 6.81 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:48:27 np0005531888 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 02:48:27 np0005531888 podman[218876]: 2025-11-22 07:48:27.507373529 +0000 UTC m=+0.065032813 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:48:27 np0005531888 podman[218877]: 2025-11-22 07:48:27.536107775 +0000 UTC m=+0.089323262 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 02:48:30 np0005531888 nova_compute[186788]: 2025-11-22 07:48:30.634 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:31 np0005531888 nova_compute[186788]: 2025-11-22 07:48:31.399 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:32 np0005531888 podman[218921]: 2025-11-22 07:48:32.67960094 +0000 UTC m=+0.050310457 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:48:35 np0005531888 nova_compute[186788]: 2025-11-22 07:48:35.638 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:35 np0005531888 podman[218946]: 2025-11-22 07:48:35.700966076 +0000 UTC m=+0.077504865 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 02:48:35 np0005531888 nova_compute[186788]: 2025-11-22 07:48:35.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:35 np0005531888 nova_compute[186788]: 2025-11-22 07:48:35.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:48:35 np0005531888 nova_compute[186788]: 2025-11-22 07:48:35.979 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:48:35 np0005531888 nova_compute[186788]: 2025-11-22 07:48:35.980 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:35 np0005531888 nova_compute[186788]: 2025-11-22 07:48:35.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:48:36 np0005531888 nova_compute[186788]: 2025-11-22 07:48:36.401 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:48:36.800 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:48:36.800 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:48:36.801 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.839 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '072c26a765bb4c6081d04d313aceda15', 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'hostId': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.842 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.842 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.842 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>]
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.873 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.read.latency volume: 891384151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.874 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.read.latency volume: 59371052 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f568fbb-d501-4a5c-8232-beca4690d704', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 891384151, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.842819', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7344e04-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '335f158b8dd63237b5b02b77e55b770b5b2680b23b4ec2fe7ed6dcac9a334d83'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59371052, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.842819', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7345f7a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '1c977d76e0ae687111e034a4e1f24946d7044f1dd8842aa9a9b9306a51a1d499'}]}, 'timestamp': '2025-11-22 07:48:36.875007', '_unique_id': 'f4c74d65024641bea54b42297ede014f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.877 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.877 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '345abd14-d8a9-4aab-bd9c-f3ef9a6bc8c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.877632', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a734d1f8-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': 'e756ec740803ac2e52c93f92b16b6729e105a65928c59b1ea86850acf613c0c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 
'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.877632', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a734da7c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '69fe77096ac61596491a132b26be11acae7c657ecee2675937efe62ce7f32ddc'}]}, 'timestamp': '2025-11-22 07:48:36.878097', '_unique_id': '31e9216206254881bb61c9f38ad1bc5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.879 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.879 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>]
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.890 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.890 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab6348eb-15ec-4613-b3eb-4e837689cf98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.880057', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a736c76a-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.579567115, 'message_signature': '3cb2feab19fc5aba5e78608143973caa9c9e58ab763d2ab9fc6ba25fe2866c5c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 
'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.880057', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a736d638-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.579567115, 'message_signature': 'e7a89bc026faa0e22f5d363aff1cbb4329a98ac5f4ea742d2ef83891df36b660'}]}, 'timestamp': '2025-11-22 07:48:36.891144', '_unique_id': '51c7c7b80653407d9dbca6882f703f48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.893 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.read.bytes volume: 30968320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.893 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62b49458-8c31-4f7f-8ec6-545d3776b371', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30968320, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.893460', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7373dee-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '0dfc70a9a5cde386dd1ac07baf868f073fbcd604734ac31234ea30be3b6f9bfa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.893460', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a737494c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': 'd2a06211734bcadf3c56c650def80d698a72ba955e07c730e4d18117c8408fcf'}]}, 'timestamp': '2025-11-22 07:48:36.894073', '_unique_id': 'c465b06c7d8b4406be01c42f80197382'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.895 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.895 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>]
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.896 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.911 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/cpu volume: 12630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cafa0e5-7ee5-460d-be71-ae645617fa56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12630000000, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'timestamp': '2025-11-22T07:48:36.896167', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a739f41c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.610482562, 'message_signature': '4817cca7952e1312f3c69bc76e81b31ad387c6d77f406fd2a4c1eec4ca3fd22f'}]}, 'timestamp': '2025-11-22 07:48:36.911594', '_unique_id': '4c0558ca7f644fc399ac52729da1f576'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.913 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.913 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93504b59-1226-4ac6-8eef-ef1a32ba6faa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.913406', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a73a49f8-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.579567115, 'message_signature': '8652dd37d95821d0c1b429fb85b041166aca906e0a9a3ac544b32f6c6de33cec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.913406', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a73a55b0-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.579567115, 'message_signature': '8fd12fcd8fd8d8a473852a4d03fd1804db3d3ca828db632fe97bdeb72dc48be0'}]}, 'timestamp': '2025-11-22 07:48:36.914012', '_unique_id': '98d29559373544ddb896407a79b02db1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.915 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.915 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2be002ad-f014-4c8f-b248-d94c218fc049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.915456', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a73a98e0-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.579567115, 'message_signature': '0ec31057a50ed802c3760ebc0ce07d0056dba7792c1ad47cd556c1af04d4d508'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.915456', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a73aa380-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.579567115, 'message_signature': '652638c4a017786e0aa273aa73f3dbcfd90eadc297f49dfbe525d214a734c165'}]}, 'timestamp': '2025-11-22 07:48:36.916049', '_unique_id': 'bb9c26fc1a3440848466ac9826b397b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.917 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.write.bytes volume: 72851456 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.917 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90a5690d-9ed5-4a6a-b77f-b574059d5841', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72851456, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.917669', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a73aee6c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': 'eaa367f53ecc372639487fe41ffba43a945ad8c5e2e68f28f1b3b64f931906e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.917669', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a73af934-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '457ececd821cffd0dabf6e2439451549dc177fd58c7db554655e276c6e2ce227'}]}, 'timestamp': '2025-11-22 07:48:36.918232', '_unique_id': 'fb9504399fd341c3a6a346f83a2a605f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.919 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.919 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1105610867>]
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.920 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/memory.usage volume: 41.234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a53cf28-bc86-4c47-a50a-412e15ae800f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.234375, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'timestamp': '2025-11-22T07:48:36.920230', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a73b5302-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.610482562, 'message_signature': '6a878d4166d9c4a4f6da8e8c5a31d79e10dab739af543d9415195c16dfc5fad3'}]}, 'timestamp': '2025-11-22 07:48:36.920552', '_unique_id': '85fa91f33ab642eb9b7897e5ff1f5e26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.922 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.922 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbb0205b-d6bc-4460-9238-1aa8aa22dc24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 307, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.921985', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a73b95d8-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '1f18acc98d0e6fc2cfb5b4e8a2a152a19c6de397e5704e9a02d42bc61244671a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.921985', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a73b9e5c-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '6cf982f5d74e52151a5c339076cdbbdc25baa9a5f4109c7a89679e684b429f97'}]}, 'timestamp': '2025-11-22 07:48:36.922468', '_unique_id': 'f6fd2b5ba45745a28b4c6c5ec4cd4166'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.923 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.write.latency volume: 2960297653 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.924 12 DEBUG ceilometer.compute.pollsters [-] c3009d0a-92f7-42b9-a930-9bdb9e70bd08/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96d5db8f-8667-4402-b6ef-7b28879c64b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2960297653, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-vda', 'timestamp': '2025-11-22T07:48:36.923877', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a73be146-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': '19981e819c5bd590145110fdec5a8a461446bb088925f444175b2039af1a6f66'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': 
None, 'resource_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08-sda', 'timestamp': '2025-11-22T07:48:36.923877', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1105610867', 'name': 'instance-00000027', 'instance_id': 'c3009d0a-92f7-42b9-a930-9bdb9e70bd08', 'instance_type': 'm1.nano', 'host': '00006139a4578f6b61cbba5578c61a7c298b39e51e102e423c4354c8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a73bec04-c777-11f0-941d-fa163e6775e5', 'monotonic_time': 4457.542311233, 'message_signature': 'a269139161ddd3ee00129c1bc95574d44e14b05a592c1e18f1efe6bc2c730bb0'}]}, 'timestamp': '2025-11-22 07:48:36.924413', '_unique_id': '43fec8c1d34348c6a8857efe4a8c1038'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:48:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531888 nova_compute[186788]: 2025-11-22 07:48:36.991 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:36 np0005531888 nova_compute[186788]: 2025-11-22 07:48:36.992 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:48:36 np0005531888 nova_compute[186788]: 2025-11-22 07:48:36.992 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:48:37 np0005531888 nova_compute[186788]: 2025-11-22 07:48:37.517 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-c3009d0a-92f7-42b9-a930-9bdb9e70bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:48:37 np0005531888 nova_compute[186788]: 2025-11-22 07:48:37.518 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-c3009d0a-92f7-42b9-a930-9bdb9e70bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:48:37 np0005531888 nova_compute[186788]: 2025-11-22 07:48:37.518 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:48:37 np0005531888 nova_compute[186788]: 2025-11-22 07:48:37.518 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c3009d0a-92f7-42b9-a930-9bdb9e70bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:48:37 np0005531888 nova_compute[186788]: 2025-11-22 07:48:37.693 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:38 np0005531888 nova_compute[186788]: 2025-11-22 07:48:38.123 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:38 np0005531888 nova_compute[186788]: 2025-11-22 07:48:38.136 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-c3009d0a-92f7-42b9-a930-9bdb9e70bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:48:38 np0005531888 nova_compute[186788]: 2025-11-22 07:48:38.137 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:48:38 np0005531888 nova_compute[186788]: 2025-11-22 07:48:38.137 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:39 np0005531888 nova_compute[186788]: 2025-11-22 07:48:39.106 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:39 np0005531888 nova_compute[186788]: 2025-11-22 07:48:39.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:40 np0005531888 nova_compute[186788]: 2025-11-22 07:48:40.641 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:40 np0005531888 nova_compute[186788]: 2025-11-22 07:48:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:41 np0005531888 nova_compute[186788]: 2025-11-22 07:48:41.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:41 np0005531888 nova_compute[186788]: 2025-11-22 07:48:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:42 np0005531888 podman[218965]: 2025-11-22 07:48:42.708140891 +0000 UTC m=+0.084621607 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.904 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.905 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.906 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.906 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.906 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.912 186792 INFO nova.compute.manager [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Terminating instance#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.917 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "refresh_cache-c3009d0a-92f7-42b9-a930-9bdb9e70bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.917 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquired lock "refresh_cache-c3009d0a-92f7-42b9-a930-9bdb9e70bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:48:42 np0005531888 nova_compute[186788]: 2025-11-22 07:48:42.917 186792 DEBUG nova.network.neutron [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.122 186792 DEBUG nova.network.neutron [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.462 186792 DEBUG nova.network.neutron [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.474 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Releasing lock "refresh_cache-c3009d0a-92f7-42b9-a930-9bdb9e70bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.475 186792 DEBUG nova.compute.manager [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:48:43 np0005531888 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000027.scope: Deactivated successfully.
Nov 22 02:48:43 np0005531888 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000027.scope: Consumed 15.834s CPU time.
Nov 22 02:48:43 np0005531888 systemd-machined[153106]: Machine qemu-18-instance-00000027 terminated.
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.732 186792 INFO nova.virt.libvirt.driver [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance destroyed successfully.#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.734 186792 DEBUG nova.objects.instance [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lazy-loading 'resources' on Instance uuid c3009d0a-92f7-42b9-a930-9bdb9e70bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.746 186792 INFO nova.virt.libvirt.driver [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Deleting instance files /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08_del#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.747 186792 INFO nova.virt.libvirt.driver [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Deletion of /var/lib/nova/instances/c3009d0a-92f7-42b9-a930-9bdb9e70bd08_del complete#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.850 186792 INFO nova.compute.manager [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.850 186792 DEBUG oslo.service.loopingcall [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.851 186792 DEBUG nova.compute.manager [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.851 186792 DEBUG nova.network.neutron [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:48:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:48:43.940 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:48:43.943 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:48:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:48:43.944 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:43 np0005531888 nova_compute[186788]: 2025-11-22 07:48:43.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.165 186792 DEBUG nova.network.neutron [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.185 186792 DEBUG nova.network.neutron [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.214 186792 INFO nova.compute.manager [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Took 0.36 seconds to deallocate network for instance.#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.295 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.295 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.738 186792 DEBUG nova.compute.provider_tree [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.749 186792 DEBUG nova.scheduler.client.report [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.776 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:44 np0005531888 nova_compute[186788]: 2025-11-22 07:48:44.945 186792 INFO nova.scheduler.client.report [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Deleted allocations for instance c3009d0a-92f7-42b9-a930-9bdb9e70bd08#033[00m
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.054 186792 DEBUG oslo_concurrency.lockutils [None req-eaae3610-5be1-4e26-947f-d4785fee1f09 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "c3009d0a-92f7-42b9-a930-9bdb9e70bd08" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.644 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:45 np0005531888 podman[218995]: 2025-11-22 07:48:45.696305193 +0000 UTC m=+0.065625109 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:45 np0005531888 nova_compute[186788]: 2025-11-22 07:48:45.987 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.187 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.188 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5761MB free_disk=73.45678329467773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.188 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.188 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.235 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.235 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.263 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.284 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.307 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.308 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:46 np0005531888 nova_compute[186788]: 2025-11-22 07:48:46.404 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:47 np0005531888 nova_compute[186788]: 2025-11-22 07:48:47.309 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:47 np0005531888 podman[219019]: 2025-11-22 07:48:47.691503134 +0000 UTC m=+0.061355465 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 22 02:48:48 np0005531888 ovn_controller[95067]: 2025-11-22T07:48:48Z|00111|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.662 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.662 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.704 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.796 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.797 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.806 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:48:50 np0005531888 nova_compute[186788]: 2025-11-22 07:48:50.807 186792 INFO nova.compute.claims [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.008 186792 DEBUG nova.compute.provider_tree [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.022 186792 DEBUG nova.scheduler.client.report [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.052 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.053 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.179 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.180 186792 DEBUG nova.network.neutron [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.199 186792 INFO nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.218 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.406 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.567 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.568 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.569 186792 INFO nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Creating image(s)#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.570 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "/var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.570 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "/var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.571 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "/var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.587 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.650 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.651 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.652 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.668 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.734 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.735 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.847 186792 DEBUG nova.network.neutron [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:48:51 np0005531888 nova_compute[186788]: 2025-11-22 07:48:51.847 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.087 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk 1073741824" returned: 0 in 0.352s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.088 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.089 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.162 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.164 186792 DEBUG nova.virt.disk.api [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Checking if we can resize image /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.164 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.232 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.233 186792 DEBUG nova.virt.disk.api [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Cannot resize image /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.234 186792 DEBUG nova.objects.instance [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lazy-loading 'migration_context' on Instance uuid 1dbcaf74-462a-4823-aaf7-9e50bd39d204 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.247 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.248 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Ensure instance console log exists: /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.248 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.248 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.249 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.250 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.256 186792 WARNING nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.261 186792 DEBUG nova.virt.libvirt.host [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.261 186792 DEBUG nova.virt.libvirt.host [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.268 186792 DEBUG nova.virt.libvirt.host [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.269 186792 DEBUG nova.virt.libvirt.host [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.270 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.271 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.271 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.271 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.272 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.272 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.272 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.272 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.272 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.273 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.273 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.273 186792 DEBUG nova.virt.hardware [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.278 186792 DEBUG nova.objects.instance [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1dbcaf74-462a-4823-aaf7-9e50bd39d204 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.289 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <uuid>1dbcaf74-462a-4823-aaf7-9e50bd39d204</uuid>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <name>instance-0000002b</name>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-1380114239</nova:name>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:48:52</nova:creationTime>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:user uuid="4d57acfa12d34a58bfad7d04ddde6554">tempest-ServerDiagnosticsNegativeTest-1412417355-project-member</nova:user>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:        <nova:project uuid="dc75cac9c18c4b49b8e7545c27132a41">tempest-ServerDiagnosticsNegativeTest-1412417355</nova:project>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <entry name="serial">1dbcaf74-462a-4823-aaf7-9e50bd39d204</entry>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <entry name="uuid">1dbcaf74-462a-4823-aaf7-9e50bd39d204</entry>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.config"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/console.log" append="off"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:48:52 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:48:52 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:48:52 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:48:52 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
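The domain definition that `_get_guest_xml` dumps above is plain libvirt XML, so the interesting device facts (e.g. which disk targets exist, matching the "No BDM found with device name vda/sda" messages that follow) can be pulled out with a standard XML parser. A minimal sketch, using a trimmed, hypothetical reconstruction of the logged domain rather than the verbatim dump:

```python
# Sketch: list (device, dev, bus) for each <disk> in a libvirt domain XML
# like the one Nova logs from _get_guest_xml. DOMAIN_XML below is a trimmed,
# hypothetical reconstruction of the logged document, not the exact dump.
import xml.etree.ElementTree as ET

DOMAIN_XML = """
<domain type="kvm">
  <devices>
    <disk type="file" device="disk">
      <driver name="qemu" type="qcow2" cache="none"/>
      <target dev="vda" bus="virtio"/>
    </disk>
    <disk type="file" device="cdrom">
      <driver name="qemu" type="raw" cache="none"/>
      <target dev="sda" bus="sata"/>
    </disk>
  </devices>
</domain>
"""

def disk_targets(xml_text):
    """Return (device, dev, bus) tuples for every <disk> under <devices>."""
    root = ET.fromstring(xml_text)
    out = []
    for disk in root.findall("./devices/disk"):
        target = disk.find("target")
        out.append((disk.get("device"), target.get("dev"), target.get("bus")))
    return out

print(disk_targets(DOMAIN_XML))
```

The two tuples this prints (`vda` on virtio, `sda` on sata) correspond to the root disk and the config-drive CD-ROM seen in the dump above.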
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.838 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.839 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:48:52 np0005531888 nova_compute[186788]: 2025-11-22 07:48:52.839 186792 INFO nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Using config drive#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.044 186792 INFO nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Creating config drive at /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.config#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.049 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1bd3u0bg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.175 186792 DEBUG oslo_concurrency.processutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1bd3u0bg" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
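The mkisofs completion line above uses oslo.concurrency's stable `returned: <rc> in <elapsed>s` form, which makes exit codes and command durations easy to scrape out of these logs. A sketch, run against an abridged, hypothetical sample line rather than the full logged one:

```python
# Sketch: parse exit code and elapsed seconds from an oslo.concurrency
# processutils "CMD ... returned: <rc> in <t>s" log line. The sample line
# is an abridged, hypothetical stand-in for the mkisofs line above.
import re

CMD_RE = re.compile(r'returned: (\d+) in ([\d.]+)s')

sample = ('CMD "/usr/bin/mkisofs -o disk.config -quiet -J -r -V config-2 /tmp/x" '
          'returned: 0 in 0.126s execute')

def cmd_result(line):
    """Return (exit_code, seconds) from a processutils CMD line, or None."""
    m = CMD_RE.search(line)
    return (int(m.group(1)), float(m.group(2))) if m else None

print(cmd_result(sample))
```

Here a zero exit code with a sub-second runtime indicates the config-drive ISO was built successfully, consistent with the "Creating config drive" INFO message above.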
Nov 22 02:48:53 np0005531888 systemd-machined[153106]: New machine qemu-19-instance-0000002b.
Nov 22 02:48:53 np0005531888 systemd[1]: Started Virtual Machine qemu-19-instance-0000002b.
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.653 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797733.6521578, 1dbcaf74-462a-4823-aaf7-9e50bd39d204 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.653 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.657 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.657 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.662 186792 INFO nova.virt.libvirt.driver [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Instance spawned successfully.#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.663 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.675 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.685 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.689 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.689 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.690 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.690 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.691 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.691 186792 DEBUG nova.virt.libvirt.driver [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.715 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.715 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797733.6588593, 1dbcaf74-462a-4823-aaf7-9e50bd39d204 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.716 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] VM Started (Lifecycle Event)#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.735 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.739 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.760 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.859 186792 INFO nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Took 2.29 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:48:53 np0005531888 nova_compute[186788]: 2025-11-22 07:48:53.859 186792 DEBUG nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:54 np0005531888 nova_compute[186788]: 2025-11-22 07:48:54.035 186792 INFO nova.compute.manager [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Took 3.27 seconds to build instance.#033[00m
Nov 22 02:48:54 np0005531888 nova_compute[186788]: 2025-11-22 07:48:54.194 186792 DEBUG oslo_concurrency.lockutils [None req-a0939c4d-4aa2-4bb2-8da9-d14a2070a280 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.564 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.565 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.565 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.566 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.566 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.574 186792 INFO nova.compute.manager [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Terminating instance#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.583 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "refresh_cache-1dbcaf74-462a-4823-aaf7-9e50bd39d204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.583 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquired lock "refresh_cache-1dbcaf74-462a-4823-aaf7-9e50bd39d204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.584 186792 DEBUG nova.network.neutron [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.651 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:55 np0005531888 nova_compute[186788]: 2025-11-22 07:48:55.796 186792 DEBUG nova.network.neutron [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.213 186792 DEBUG nova.network.neutron [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.225 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Releasing lock "refresh_cache-1dbcaf74-462a-4823-aaf7-9e50bd39d204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.226 186792 DEBUG nova.compute.manager [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:48:56 np0005531888 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Nov 22 02:48:56 np0005531888 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002b.scope: Consumed 2.958s CPU time.
Nov 22 02:48:56 np0005531888 systemd-machined[153106]: Machine qemu-19-instance-0000002b terminated.
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.407 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.475 186792 INFO nova.virt.libvirt.driver [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Instance destroyed successfully.#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.476 186792 DEBUG nova.objects.instance [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lazy-loading 'resources' on Instance uuid 1dbcaf74-462a-4823-aaf7-9e50bd39d204 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.490 186792 INFO nova.virt.libvirt.driver [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Deleting instance files /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204_del#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.491 186792 INFO nova.virt.libvirt.driver [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Deletion of /var/lib/nova/instances/1dbcaf74-462a-4823-aaf7-9e50bd39d204_del complete#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.711 186792 INFO nova.compute.manager [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.711 186792 DEBUG oslo.service.loopingcall [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.712 186792 DEBUG nova.compute.manager [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:48:56 np0005531888 nova_compute[186788]: 2025-11-22 07:48:56.712 186792 DEBUG nova.network.neutron [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.131 186792 DEBUG nova.network.neutron [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.146 186792 DEBUG nova.network.neutron [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.164 186792 INFO nova.compute.manager [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Took 0.45 seconds to deallocate network for instance.#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.407 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.408 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.466 186792 DEBUG nova.compute.provider_tree [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.477 186792 DEBUG nova.scheduler.client.report [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.524 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.624 186792 INFO nova.scheduler.client.report [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Deleted allocations for instance 1dbcaf74-462a-4823-aaf7-9e50bd39d204#033[00m
Nov 22 02:48:57 np0005531888 podman[219090]: 2025-11-22 07:48:57.703267331 +0000 UTC m=+0.070876346 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:48:57 np0005531888 nova_compute[186788]: 2025-11-22 07:48:57.762 186792 DEBUG oslo_concurrency.lockutils [None req-8416146d-d0e4-4783-a4fe-d97b6ba545e9 4d57acfa12d34a58bfad7d04ddde6554 dc75cac9c18c4b49b8e7545c27132a41 - - default default] Lock "1dbcaf74-462a-4823-aaf7-9e50bd39d204" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:57 np0005531888 podman[219091]: 2025-11-22 07:48:57.766468649 +0000 UTC m=+0.129173045 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:48:58 np0005531888 nova_compute[186788]: 2025-11-22 07:48:58.731 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797723.7295275, c3009d0a-92f7-42b9-a930-9bdb9e70bd08 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:58 np0005531888 nova_compute[186788]: 2025-11-22 07:48:58.731 186792 INFO nova.compute.manager [-] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:48:58 np0005531888 nova_compute[186788]: 2025-11-22 07:48:58.752 186792 DEBUG nova.compute.manager [None req-7345755a-cadb-4847-b355-0b79c723355d - - - - - -] [instance: c3009d0a-92f7-42b9-a930-9bdb9e70bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:00 np0005531888 nova_compute[186788]: 2025-11-22 07:49:00.655 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:01 np0005531888 nova_compute[186788]: 2025-11-22 07:49:01.448 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:03 np0005531888 podman[219132]: 2025-11-22 07:49:03.692469618 +0000 UTC m=+0.062575154 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:49:05 np0005531888 nova_compute[186788]: 2025-11-22 07:49:05.658 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.651 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.651 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:06 np0005531888 podman[219158]: 2025-11-22 07:49:06.708579217 +0000 UTC m=+0.060003541 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.747 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.855 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.855 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.864 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:49:06 np0005531888 nova_compute[186788]: 2025-11-22 07:49:06.865 186792 INFO nova.compute.claims [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.005 186792 DEBUG nova.compute.provider_tree [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.023 186792 DEBUG nova.scheduler.client.report [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.203 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.204 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.360 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.420 186792 INFO nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.453 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.570 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.571 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.572 186792 INFO nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating image(s)#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.573 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.573 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.574 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.591 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.656 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.657 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.658 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.670 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.730 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.732 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.775 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.776 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.777 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.847 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.849 186792 DEBUG nova.virt.disk.api [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Checking if we can resize image /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.850 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.913 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.914 186792 DEBUG nova.virt.disk.api [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Cannot resize image /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.915 186792 DEBUG nova.objects.instance [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'migration_context' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.932 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.932 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Ensure instance console log exists: /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.933 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.933 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.933 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.935 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.941 186792 WARNING nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.947 186792 DEBUG nova.virt.libvirt.host [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.948 186792 DEBUG nova.virt.libvirt.host [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.952 186792 DEBUG nova.virt.libvirt.host [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.952 186792 DEBUG nova.virt.libvirt.host [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.955 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.955 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.956 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.956 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.957 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.957 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.957 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.957 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.957 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.958 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.958 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.958 186792 DEBUG nova.virt.hardware [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:49:07 np0005531888 nova_compute[186788]: 2025-11-22 07:49:07.963 186792 DEBUG nova.objects.instance [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'pci_devices' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.000 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <uuid>d9e99a21-c535-4d0b-a093-fd29f52db0d3</uuid>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <name>instance-0000002c</name>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersAdmin275Test-server-533653631</nova:name>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:49:07</nova:creationTime>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:user uuid="242a09363b3f43f3890b84468fa9845e">tempest-ServersAdmin275Test-1098126455-project-member</nova:user>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:        <nova:project uuid="7547ad7b81b047bca7813a5a55487129">tempest-ServersAdmin275Test-1098126455</nova:project>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <entry name="serial">d9e99a21-c535-4d0b-a093-fd29f52db0d3</entry>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <entry name="uuid">d9e99a21-c535-4d0b-a093-fd29f52db0d3</entry>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/console.log" append="off"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:49:08 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:49:08 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:49:08 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:49:08 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.061 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.061 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.062 186792 INFO nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Using config drive#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.368 186792 INFO nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating config drive at /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.374 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxkntqpw5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:08 np0005531888 nova_compute[186788]: 2025-11-22 07:49:08.505 186792 DEBUG oslo_concurrency.processutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxkntqpw5" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:08 np0005531888 systemd-machined[153106]: New machine qemu-20-instance-0000002c.
Nov 22 02:49:08 np0005531888 systemd[1]: Started Virtual Machine qemu-20-instance-0000002c.
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.222 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797749.2216218, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.224 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.226 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.227 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.231 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance spawned successfully.#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.232 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.250 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.257 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.263 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.264 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.264 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.265 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.265 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.266 186792 DEBUG nova.virt.libvirt.driver [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.287 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.287 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797749.2220318, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.288 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Started (Lifecycle Event)#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.317 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.322 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.350 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.354 186792 INFO nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Took 1.78 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.355 186792 DEBUG nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.447 186792 INFO nova.compute.manager [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Took 2.63 seconds to build instance.#033[00m
Nov 22 02:49:09 np0005531888 nova_compute[186788]: 2025-11-22 07:49:09.467 186792 DEBUG oslo_concurrency.lockutils [None req-b820bc55-43de-43b3-96cf-a7ad24f4d69b 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:10 np0005531888 nova_compute[186788]: 2025-11-22 07:49:10.662 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:11 np0005531888 nova_compute[186788]: 2025-11-22 07:49:11.452 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:11 np0005531888 nova_compute[186788]: 2025-11-22 07:49:11.474 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797736.4734912, 1dbcaf74-462a-4823-aaf7-9e50bd39d204 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:11 np0005531888 nova_compute[186788]: 2025-11-22 07:49:11.475 186792 INFO nova.compute.manager [-] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:49:11 np0005531888 nova_compute[186788]: 2025-11-22 07:49:11.502 186792 DEBUG nova.compute.manager [None req-4fa8d2fc-b5fb-41b2-acb0-19838acf88a1 - - - - - -] [instance: 1dbcaf74-462a-4823-aaf7-9e50bd39d204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.242 186792 INFO nova.compute.manager [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Rebuilding instance#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.566 186792 DEBUG nova.compute.manager [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.676 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'pci_requests' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.686 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'pci_devices' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.695 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'resources' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.703 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'migration_context' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.712 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:49:12 np0005531888 nova_compute[186788]: 2025-11-22 07:49:12.717 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:49:13 np0005531888 podman[219221]: 2025-11-22 07:49:13.720731914 +0000 UTC m=+0.082032146 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 02:49:15 np0005531888 nova_compute[186788]: 2025-11-22 07:49:15.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:16 np0005531888 nova_compute[186788]: 2025-11-22 07:49:16.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:16 np0005531888 podman[219242]: 2025-11-22 07:49:16.693344081 +0000 UTC m=+0.059858719 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:49:18 np0005531888 podman[219266]: 2025-11-22 07:49:18.69636867 +0000 UTC m=+0.065020913 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64)
Nov 22 02:49:20 np0005531888 nova_compute[186788]: 2025-11-22 07:49:20.712 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:21 np0005531888 nova_compute[186788]: 2025-11-22 07:49:21.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:22 np0005531888 nova_compute[186788]: 2025-11-22 07:49:22.773 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:49:25 np0005531888 nova_compute[186788]: 2025-11-22 07:49:25.716 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:26 np0005531888 nova_compute[186788]: 2025-11-22 07:49:26.459 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:28 np0005531888 podman[219309]: 2025-11-22 07:49:28.697771146 +0000 UTC m=+0.067697618 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 02:49:28 np0005531888 podman[219310]: 2025-11-22 07:49:28.733300246 +0000 UTC m=+0.094579739 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:49:30 np0005531888 nova_compute[186788]: 2025-11-22 07:49:30.720 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:31 np0005531888 nova_compute[186788]: 2025-11-22 07:49:31.461 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:33 np0005531888 nova_compute[186788]: 2025-11-22 07:49:33.833 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:49:34 np0005531888 podman[219356]: 2025-11-22 07:49:34.682457755 +0000 UTC m=+0.052478931 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:49:35 np0005531888 nova_compute[186788]: 2025-11-22 07:49:35.724 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:36 np0005531888 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Nov 22 02:49:36 np0005531888 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002c.scope: Consumed 16.133s CPU time.
Nov 22 02:49:36 np0005531888 systemd-machined[153106]: Machine qemu-20-instance-0000002c terminated.
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.463 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:49:36.800 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:49:36.801 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:49:36.801 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.851 186792 INFO nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.859 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance destroyed successfully.#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.865 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance destroyed successfully.#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.866 186792 INFO nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deleting instance files /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3_del#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.867 186792 INFO nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deletion of /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3_del complete#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-d9e99a21-c535-4d0b-a093-fd29f52db0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-d9e99a21-c535-4d0b-a093-fd29f52db0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.983 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:49:36 np0005531888 nova_compute[186788]: 2025-11-22 07:49:36.983 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.102 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.102 186792 INFO nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating image(s)#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.103 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.104 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.104 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.105 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.105 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.256 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.576 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.591 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-d9e99a21-c535-4d0b-a093-fd29f52db0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:49:37 np0005531888 nova_compute[186788]: 2025-11-22 07:49:37.592 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:49:37 np0005531888 podman[219389]: 2025-11-22 07:49:37.711941807 +0000 UTC m=+0.084450244 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:49:38 np0005531888 nova_compute[186788]: 2025-11-22 07:49:38.649 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:38 np0005531888 nova_compute[186788]: 2025-11-22 07:49:38.714 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:38 np0005531888 nova_compute[186788]: 2025-11-22 07:49:38.716 186792 DEBUG nova.virt.images [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] 360f90ca-2ddb-4e60-a48e-364e3b48bd96 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:49:38 np0005531888 nova_compute[186788]: 2025-11-22 07:49:38.717 186792 DEBUG nova.privsep.utils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:49:38 np0005531888 nova_compute[186788]: 2025-11-22 07:49:38.717 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.109 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.115 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.186 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.188 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.206 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.264 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.266 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.267 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.283 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.342 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.344 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.387 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.389 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.389 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.455 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.456 186792 DEBUG nova.virt.disk.api [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Checking if we can resize image /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.456 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.523 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.524 186792 DEBUG nova.virt.disk.api [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Cannot resize image /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.525 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.525 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Ensure instance console log exists: /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.526 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.526 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.527 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.529 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.537 186792 WARNING nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.544 186792 DEBUG nova.virt.libvirt.host [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.545 186792 DEBUG nova.virt.libvirt.host [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.551 186792 DEBUG nova.virt.libvirt.host [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.552 186792 DEBUG nova.virt.libvirt.host [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.553 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.554 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.554 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.554 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.555 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.555 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.555 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.555 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.556 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.556 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.556 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.556 186792 DEBUG nova.virt.hardware [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.557 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.583 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <uuid>d9e99a21-c535-4d0b-a093-fd29f52db0d3</uuid>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <name>instance-0000002c</name>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersAdmin275Test-server-533653631</nova:name>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:49:39</nova:creationTime>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:user uuid="242a09363b3f43f3890b84468fa9845e">tempest-ServersAdmin275Test-1098126455-project-member</nova:user>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:        <nova:project uuid="7547ad7b81b047bca7813a5a55487129">tempest-ServersAdmin275Test-1098126455</nova:project>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <entry name="serial">d9e99a21-c535-4d0b-a093-fd29f52db0d3</entry>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <entry name="uuid">d9e99a21-c535-4d0b-a093-fd29f52db0d3</entry>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/console.log" append="off"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:49:39 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:49:39 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:49:39 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:49:39 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.645 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.646 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.647 186792 INFO nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Using config drive#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.662 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'ec2_ids' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:39 np0005531888 nova_compute[186788]: 2025-11-22 07:49:39.696 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'keypairs' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.275 186792 INFO nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating config drive at /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.281 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphn1we5nm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.411 186792 DEBUG oslo_concurrency.processutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphn1we5nm" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:40 np0005531888 systemd-machined[153106]: New machine qemu-21-instance-0000002c.
Nov 22 02:49:40 np0005531888 systemd[1]: Started Virtual Machine qemu-21-instance-0000002c.
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.585 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.728 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.890 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for d9e99a21-c535-4d0b-a093-fd29f52db0d3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.891 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797780.8892775, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.891 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.893 186792 DEBUG nova.compute.manager [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.894 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.898 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance spawned successfully.#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.899 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.913 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.921 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.922 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.922 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.923 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.923 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.923 186792 DEBUG nova.virt.libvirt.driver [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.926 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.955 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.955 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797780.891002, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.956 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Started (Lifecycle Event)#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.987 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:40 np0005531888 nova_compute[186788]: 2025-11-22 07:49:40.991 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.012 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.014 186792 DEBUG nova.compute.manager [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.086 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.087 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.087 186792 DEBUG nova.objects.instance [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.162 186792 DEBUG oslo_concurrency.lockutils [None req-d6681cc2-dbc0-483d-b231-ba567dc92e59 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:41 np0005531888 nova_compute[186788]: 2025-11-22 07:49:41.465 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:43 np0005531888 nova_compute[186788]: 2025-11-22 07:49:43.625 186792 INFO nova.compute.manager [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Rebuilding instance#033[00m
Nov 22 02:49:43 np0005531888 nova_compute[186788]: 2025-11-22 07:49:43.950 186792 DEBUG nova.compute.manager [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:43 np0005531888 nova_compute[186788]: 2025-11-22 07:49:43.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:43 np0005531888 nova_compute[186788]: 2025-11-22 07:49:43.957 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:44 np0005531888 nova_compute[186788]: 2025-11-22 07:49:44.015 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'pci_requests' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:44 np0005531888 nova_compute[186788]: 2025-11-22 07:49:44.028 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'pci_devices' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:44 np0005531888 nova_compute[186788]: 2025-11-22 07:49:44.042 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'resources' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:44 np0005531888 nova_compute[186788]: 2025-11-22 07:49:44.056 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'migration_context' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:44 np0005531888 nova_compute[186788]: 2025-11-22 07:49:44.072 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:49:44 np0005531888 nova_compute[186788]: 2025-11-22 07:49:44.077 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:49:44 np0005531888 podman[219465]: 2025-11-22 07:49:44.703450114 +0000 UTC m=+0.070686641 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 02:49:45 np0005531888 nova_compute[186788]: 2025-11-22 07:49:45.734 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:45 np0005531888 nova_compute[186788]: 2025-11-22 07:49:45.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:45 np0005531888 nova_compute[186788]: 2025-11-22 07:49:45.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:49:46 np0005531888 nova_compute[186788]: 2025-11-22 07:49:46.467 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:46 np0005531888 nova_compute[186788]: 2025-11-22 07:49:46.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:46 np0005531888 nova_compute[186788]: 2025-11-22 07:49:46.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:46 np0005531888 nova_compute[186788]: 2025-11-22 07:49:46.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:46 np0005531888 nova_compute[186788]: 2025-11-22 07:49:46.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:46 np0005531888 nova_compute[186788]: 2025-11-22 07:49:46.975 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.060 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:47 np0005531888 podman[219485]: 2025-11-22 07:49:47.110454123 +0000 UTC m=+0.087005245 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.131 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.133 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.214 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.424 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.426 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5603MB free_disk=73.42206954956055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.426 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.427 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.502 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance d9e99a21-c535-4d0b-a093-fd29f52db0d3 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.502 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.503 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.522 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.542 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.543 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.560 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.582 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.620 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.634 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.809 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:49:47 np0005531888 nova_compute[186788]: 2025-11-22 07:49:47.810 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:49 np0005531888 podman[219516]: 2025-11-22 07:49:49.690006354 +0000 UTC m=+0.064242183 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public)
Nov 22 02:49:49 np0005531888 nova_compute[186788]: 2025-11-22 07:49:49.806 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:49 np0005531888 nova_compute[186788]: 2025-11-22 07:49:49.827 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:50 np0005531888 nova_compute[186788]: 2025-11-22 07:49:50.738 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:51 np0005531888 nova_compute[186788]: 2025-11-22 07:49:51.472 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:54 np0005531888 nova_compute[186788]: 2025-11-22 07:49:54.129 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:49:55 np0005531888 nova_compute[186788]: 2025-11-22 07:49:55.742 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:49:55.837 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:49:55 np0005531888 nova_compute[186788]: 2025-11-22 07:49:55.837 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:49:55.839 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:49:56 np0005531888 nova_compute[186788]: 2025-11-22 07:49:56.473 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:59 np0005531888 podman[219558]: 2025-11-22 07:49:59.709533824 +0000 UTC m=+0.081040849 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:49:59 np0005531888 podman[219559]: 2025-11-22 07:49:59.740635073 +0000 UTC m=+0.107348213 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 02:50:00 np0005531888 nova_compute[186788]: 2025-11-22 07:50:00.745 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:01 np0005531888 nova_compute[186788]: 2025-11-22 07:50:01.475 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:01.842 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:05 np0005531888 nova_compute[186788]: 2025-11-22 07:50:05.180 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:50:05 np0005531888 podman[219604]: 2025-11-22 07:50:05.692762131 +0000 UTC m=+0.053963639 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:50:05 np0005531888 nova_compute[186788]: 2025-11-22 07:50:05.786 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:06 np0005531888 nova_compute[186788]: 2025-11-22 07:50:06.477 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:07 np0005531888 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Nov 22 02:50:07 np0005531888 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002c.scope: Consumed 15.770s CPU time.
Nov 22 02:50:07 np0005531888 systemd-machined[153106]: Machine qemu-21-instance-0000002c terminated.
Nov 22 02:50:08 np0005531888 nova_compute[186788]: 2025-11-22 07:50:08.197 186792 INFO nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 02:50:08 np0005531888 nova_compute[186788]: 2025-11-22 07:50:08.203 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance destroyed successfully.#033[00m
Nov 22 02:50:08 np0005531888 nova_compute[186788]: 2025-11-22 07:50:08.207 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance destroyed successfully.#033[00m
Nov 22 02:50:08 np0005531888 nova_compute[186788]: 2025-11-22 07:50:08.208 186792 INFO nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deleting instance files /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3_del#033[00m
Nov 22 02:50:08 np0005531888 nova_compute[186788]: 2025-11-22 07:50:08.209 186792 INFO nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deletion of /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3_del complete#033[00m
Nov 22 02:50:08 np0005531888 podman[219637]: 2025-11-22 07:50:08.673492857 +0000 UTC m=+0.052500714 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.029 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.029 186792 INFO nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating image(s)#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.031 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Acquiring lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.031 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.032 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.045 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.121 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.122 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.123 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.136 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.200 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.201 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.245 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.247 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.247 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.307 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.309 186792 DEBUG nova.virt.disk.api [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Checking if we can resize image /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.309 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.375 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.376 186792 DEBUG nova.virt.disk.api [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Cannot resize image /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.377 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.377 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Ensure instance console log exists: /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.378 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.378 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.379 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.381 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.387 186792 WARNING nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.396 186792 DEBUG nova.virt.libvirt.host [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.397 186792 DEBUG nova.virt.libvirt.host [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.401 186792 DEBUG nova.virt.libvirt.host [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.401 186792 DEBUG nova.virt.libvirt.host [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.403 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.404 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.404 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.404 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.405 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.405 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.405 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.405 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.406 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.406 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.406 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.407 186792 DEBUG nova.virt.hardware [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.407 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'vcpu_model' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.433 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <uuid>d9e99a21-c535-4d0b-a093-fd29f52db0d3</uuid>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <name>instance-0000002c</name>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersAdmin275Test-server-533653631</nova:name>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:50:09</nova:creationTime>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:user uuid="242a09363b3f43f3890b84468fa9845e">tempest-ServersAdmin275Test-1098126455-project-member</nova:user>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:        <nova:project uuid="7547ad7b81b047bca7813a5a55487129">tempest-ServersAdmin275Test-1098126455</nova:project>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <entry name="serial">d9e99a21-c535-4d0b-a093-fd29f52db0d3</entry>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <entry name="uuid">d9e99a21-c535-4d0b-a093-fd29f52db0d3</entry>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/console.log" append="off"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:50:09 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:50:09 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:50:09 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:50:09 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.524 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.525 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.525 186792 INFO nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Using config drive#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.549 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'ec2_ids' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:09 np0005531888 nova_compute[186788]: 2025-11-22 07:50:09.593 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lazy-loading 'keypairs' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:10 np0005531888 nova_compute[186788]: 2025-11-22 07:50:10.253 186792 INFO nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Creating config drive at /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config#033[00m
Nov 22 02:50:10 np0005531888 nova_compute[186788]: 2025-11-22 07:50:10.258 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprwqc8mr4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:10 np0005531888 nova_compute[186788]: 2025-11-22 07:50:10.388 186792 DEBUG oslo_concurrency.processutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprwqc8mr4" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:10 np0005531888 systemd-machined[153106]: New machine qemu-22-instance-0000002c.
Nov 22 02:50:10 np0005531888 systemd[1]: Started Virtual Machine qemu-22-instance-0000002c.
Nov 22 02:50:10 np0005531888 nova_compute[186788]: 2025-11-22 07:50:10.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:10 np0005531888 nova_compute[186788]: 2025-11-22 07:50:10.998 186792 DEBUG nova.compute.manager [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:10.999 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.000 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for d9e99a21-c535-4d0b-a093-fd29f52db0d3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.000 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797810.998081, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.000 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.010 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance spawned successfully.#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.011 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.026 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.033 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.036 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.037 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.037 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.038 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.038 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.039 186792 DEBUG nova.virt.libvirt.driver [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.070 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.071 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797810.9997988, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.071 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Started (Lifecycle Event)#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.094 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.098 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.113 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.130 186792 DEBUG nova.compute.manager [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.210 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.210 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.211 186792 DEBUG nova.objects.instance [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.284 186792 DEBUG oslo_concurrency.lockutils [None req-ce18af72-f809-4b99-87ab-41c799d64a79 19481fe1b017429aa0dfcad0fcd20f1b b1d26a47e4804217a570aee67d03d21a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.480 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.854 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.856 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.856 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.856 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.857 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.864 186792 INFO nova.compute.manager [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Terminating instance#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.871 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "refresh_cache-d9e99a21-c535-4d0b-a093-fd29f52db0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.872 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquired lock "refresh_cache-d9e99a21-c535-4d0b-a093-fd29f52db0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:11 np0005531888 nova_compute[186788]: 2025-11-22 07:50:11.872 186792 DEBUG nova.network.neutron [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.151 186792 DEBUG nova.network.neutron [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.472 186792 DEBUG nova.network.neutron [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.486 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Releasing lock "refresh_cache-d9e99a21-c535-4d0b-a093-fd29f52db0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.487 186792 DEBUG nova.compute.manager [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:50:12 np0005531888 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Nov 22 02:50:12 np0005531888 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002c.scope: Consumed 2.063s CPU time.
Nov 22 02:50:12 np0005531888 systemd-machined[153106]: Machine qemu-22-instance-0000002c terminated.
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.736 186792 INFO nova.virt.libvirt.driver [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance destroyed successfully.#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.737 186792 DEBUG nova.objects.instance [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lazy-loading 'resources' on Instance uuid d9e99a21-c535-4d0b-a093-fd29f52db0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.749 186792 INFO nova.virt.libvirt.driver [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deleting instance files /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3_del#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.751 186792 INFO nova.virt.libvirt.driver [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deletion of /var/lib/nova/instances/d9e99a21-c535-4d0b-a093-fd29f52db0d3_del complete#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.845 186792 INFO nova.compute.manager [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.846 186792 DEBUG oslo.service.loopingcall [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.846 186792 DEBUG nova.compute.manager [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:50:12 np0005531888 nova_compute[186788]: 2025-11-22 07:50:12.846 186792 DEBUG nova.network.neutron [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.402 186792 DEBUG nova.network.neutron [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.418 186792 DEBUG nova.network.neutron [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.431 186792 INFO nova.compute.manager [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Took 0.58 seconds to deallocate network for instance.#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.548 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.549 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.609 186792 DEBUG nova.compute.provider_tree [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.625 186792 DEBUG nova.scheduler.client.report [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.662 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.704 186792 INFO nova.scheduler.client.report [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Deleted allocations for instance d9e99a21-c535-4d0b-a093-fd29f52db0d3#033[00m
Nov 22 02:50:13 np0005531888 nova_compute[186788]: 2025-11-22 07:50:13.798 186792 DEBUG oslo_concurrency.lockutils [None req-4238fc75-eafc-4a4f-a9de-56655e642a55 242a09363b3f43f3890b84468fa9845e 7547ad7b81b047bca7813a5a55487129 - - default default] Lock "d9e99a21-c535-4d0b-a093-fd29f52db0d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:15 np0005531888 podman[219709]: 2025-11-22 07:50:15.68783241 +0000 UTC m=+0.062170596 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 22 02:50:15 np0005531888 nova_compute[186788]: 2025-11-22 07:50:15.795 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:16 np0005531888 nova_compute[186788]: 2025-11-22 07:50:16.481 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:17 np0005531888 podman[219729]: 2025-11-22 07:50:17.684035679 +0000 UTC m=+0.054770299 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:50:20 np0005531888 podman[219753]: 2025-11-22 07:50:20.692049282 +0000 UTC m=+0.062244389 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41)
Nov 22 02:50:20 np0005531888 nova_compute[186788]: 2025-11-22 07:50:20.800 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:21 np0005531888 nova_compute[186788]: 2025-11-22 07:50:21.482 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:25 np0005531888 nova_compute[186788]: 2025-11-22 07:50:25.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.310 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.311 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.334 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.452 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.453 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.460 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.461 186792 INFO nova.compute.claims [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.483 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.583 186792 DEBUG nova.compute.provider_tree [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.596 186792 DEBUG nova.scheduler.client.report [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.619 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.620 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.690 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.690 186792 DEBUG nova.network.neutron [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.711 186792 INFO nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.735 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.860 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.863 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.863 186792 INFO nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Creating image(s)#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.864 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.864 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.865 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.881 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.918 186792 DEBUG nova.policy [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.946 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.947 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.947 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:26 np0005531888 nova_compute[186788]: 2025-11-22 07:50:26.959 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.021 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.023 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.096 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk 1073741824" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.098 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.098 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.157 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.159 186792 DEBUG nova.virt.disk.api [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.159 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.219 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.220 186792 DEBUG nova.virt.disk.api [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.221 186792 DEBUG nova.objects.instance [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d11506a-3658-40b5-ab70-cf036fa5b543 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.232 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.233 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Ensure instance console log exists: /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.234 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.234 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.234 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.736 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797812.7345898, d9e99a21-c535-4d0b-a093-fd29f52db0d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.737 186792 INFO nova.compute.manager [-] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:50:27 np0005531888 nova_compute[186788]: 2025-11-22 07:50:27.753 186792 DEBUG nova.compute.manager [None req-b6469e9c-b6ca-4a13-8b1a-cf7eb26e26c4 - - - - - -] [instance: d9e99a21-c535-4d0b-a093-fd29f52db0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:28 np0005531888 nova_compute[186788]: 2025-11-22 07:50:28.270 186792 DEBUG nova.network.neutron [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Successfully created port: b16926eb-7b26-4c46-9d49-b192a9fde0df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.541 186792 DEBUG nova.network.neutron [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Successfully updated port: b16926eb-7b26-4c46-9d49-b192a9fde0df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.564 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-6d11506a-3658-40b5-ab70-cf036fa5b543" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.565 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-6d11506a-3658-40b5-ab70-cf036fa5b543" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.565 186792 DEBUG nova.network.neutron [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.656 186792 DEBUG nova.compute.manager [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-changed-b16926eb-7b26-4c46-9d49-b192a9fde0df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.657 186792 DEBUG nova.compute.manager [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Refreshing instance network info cache due to event network-changed-b16926eb-7b26-4c46-9d49-b192a9fde0df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.657 186792 DEBUG oslo_concurrency.lockutils [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6d11506a-3658-40b5-ab70-cf036fa5b543" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:30 np0005531888 podman[219790]: 2025-11-22 07:50:30.708441583 +0000 UTC m=+0.071891819 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Nov 22 02:50:30 np0005531888 podman[219791]: 2025-11-22 07:50:30.745978966 +0000 UTC m=+0.101308617 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.746 186792 DEBUG nova.network.neutron [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:50:30 np0005531888 nova_compute[186788]: 2025-11-22 07:50:30.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:31 np0005531888 nova_compute[186788]: 2025-11-22 07:50:31.486 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.629 186792 DEBUG nova.network.neutron [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Updating instance_info_cache with network_info: [{"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.648 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-6d11506a-3658-40b5-ab70-cf036fa5b543" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.648 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Instance network_info: |[{"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.649 186792 DEBUG oslo_concurrency.lockutils [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6d11506a-3658-40b5-ab70-cf036fa5b543" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.649 186792 DEBUG nova.network.neutron [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Refreshing network info cache for port b16926eb-7b26-4c46-9d49-b192a9fde0df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.653 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Start _get_guest_xml network_info=[{"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.658 186792 WARNING nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.666 186792 DEBUG nova.virt.libvirt.host [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.667 186792 DEBUG nova.virt.libvirt.host [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.672 186792 DEBUG nova.virt.libvirt.host [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.672 186792 DEBUG nova.virt.libvirt.host [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.673 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.674 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.674 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.674 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.674 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.675 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.675 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.675 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.675 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.676 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.676 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.676 186792 DEBUG nova.virt.hardware [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.680 186792 DEBUG nova.virt.libvirt.vif [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-945838049',display_name='tempest-DeleteServersTestJSON-server-945838049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-945838049',id=47,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-6u3zkf7z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:26Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=6d11506a-3658-40b5-ab70-cf036fa5b543,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.680 186792 DEBUG nova.network.os_vif_util [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.681 186792 DEBUG nova.network.os_vif_util [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.681 186792 DEBUG nova.objects.instance [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d11506a-3658-40b5-ab70-cf036fa5b543 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.697 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <uuid>6d11506a-3658-40b5-ab70-cf036fa5b543</uuid>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <name>instance-0000002f</name>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:name>tempest-DeleteServersTestJSON-server-945838049</nova:name>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:50:32</nova:creationTime>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        <nova:port uuid="b16926eb-7b26-4c46-9d49-b192a9fde0df">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <entry name="serial">6d11506a-3658-40b5-ab70-cf036fa5b543</entry>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <entry name="uuid">6d11506a-3658-40b5-ab70-cf036fa5b543</entry>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.config"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:9c:ac:f0"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <target dev="tapb16926eb-7b"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/console.log" append="off"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:50:32 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:50:32 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:50:32 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:50:32 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.699 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Preparing to wait for external event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.699 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.699 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.700 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.700 186792 DEBUG nova.virt.libvirt.vif [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-945838049',display_name='tempest-DeleteServersTestJSON-server-945838049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-945838049',id=47,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-6u3zkf7z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:26Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=6d11506a-3658-40b5-ab70-cf036fa5b543,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.700 186792 DEBUG nova.network.os_vif_util [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.701 186792 DEBUG nova.network.os_vif_util [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.701 186792 DEBUG os_vif [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.702 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.703 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.706 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.706 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb16926eb-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.706 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb16926eb-7b, col_values=(('external_ids', {'iface-id': 'b16926eb-7b26-4c46-9d49-b192a9fde0df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:ac:f0', 'vm-uuid': '6d11506a-3658-40b5-ab70-cf036fa5b543'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.708 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:32 np0005531888 NetworkManager[55166]: <info>  [1763797832.7092] manager: (tapb16926eb-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.711 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.719 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:32 np0005531888 nova_compute[186788]: 2025-11-22 07:50:32.721 186792 INFO os_vif [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b')#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.033 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.033 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.034 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:9c:ac:f0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.034 186792 INFO nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Using config drive#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.445 186792 INFO nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Creating config drive at /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.config#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.450 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqxrlvfxw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.576 186792 DEBUG oslo_concurrency.processutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqxrlvfxw" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:33 np0005531888 kernel: tapb16926eb-7b: entered promiscuous mode
Nov 22 02:50:33 np0005531888 NetworkManager[55166]: <info>  [1763797833.6492] manager: (tapb16926eb-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.652 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.656 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:33 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:33Z|00112|binding|INFO|Claiming lport b16926eb-7b26-4c46-9d49-b192a9fde0df for this chassis.
Nov 22 02:50:33 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:33Z|00113|binding|INFO|b16926eb-7b26-4c46-9d49-b192a9fde0df: Claiming fa:16:3e:9c:ac:f0 10.100.0.6
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.668 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:ac:f0 10.100.0.6'], port_security=['fa:16:3e:9c:ac:f0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6d11506a-3658-40b5-ab70-cf036fa5b543', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=b16926eb-7b26-4c46-9d49-b192a9fde0df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.670 104023 INFO neutron.agent.ovn.metadata.agent [-] Port b16926eb-7b26-4c46-9d49-b192a9fde0df in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.672 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c#033[00m
Nov 22 02:50:33 np0005531888 systemd-udevd[219853]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.688 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1f009b24-91ee-4339-8fc5-c6f6263908b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.689 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.691 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.691 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b317f427-6948-4fc0-b68a-90849ce10697]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.693 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[daefbec9-509d-4648-8027-6bf284ef1b56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 systemd-machined[153106]: New machine qemu-23-instance-0000002f.
Nov 22 02:50:33 np0005531888 NetworkManager[55166]: <info>  [1763797833.6999] device (tapb16926eb-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:50:33 np0005531888 NetworkManager[55166]: <info>  [1763797833.7008] device (tapb16926eb-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.707 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3cfa90-d293-42c0-847a-976545de3bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.712 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:33 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:33Z|00114|binding|INFO|Setting lport b16926eb-7b26-4c46-9d49-b192a9fde0df ovn-installed in OVS
Nov 22 02:50:33 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:33Z|00115|binding|INFO|Setting lport b16926eb-7b26-4c46-9d49-b192a9fde0df up in Southbound
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:33 np0005531888 systemd[1]: Started Virtual Machine qemu-23-instance-0000002f.
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.732 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4b44d077-609e-4b17-9407-c70b6991d0a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.769 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[be7eb1a8-1596-4f33-9d0f-8a9b1475e549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.775 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6a20f6e1-d478-49b2-9ac2-ad30f019cd72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 NetworkManager[55166]: <info>  [1763797833.7762] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.811 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dda179f9-7478-43a9-b70d-a7e5f5aedf09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.814 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[77e25efc-b0b7-42c6-9198-32314f755252]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 NetworkManager[55166]: <info>  [1763797833.8379] device (tap5e910dbb-20): carrier: link connected
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.844 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[52081812-11bd-4861-8798-4e4dae1a3775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.866 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[deae2c95-2a0a-403c-ad07-617489ba7e55]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457448, 'reachable_time': 19595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219888, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.885 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[187ffab5-cd9a-4262-8bd5-138af4709b5b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457448, 'tstamp': 457448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219889, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.901 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[628df7a7-48b5-416c-a492-5d83c52d7630]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457448, 'reachable_time': 19595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219890, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:33.939 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f88a675a-dd2a-4f5b-98bb-165ec2ae55cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.952 186792 DEBUG nova.compute.manager [req-5194f3c1-ca31-44cc-bfe6-8b22cf69d88d req-918a1499-67f2-4615-8e80-1f61cdf15b9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.953 186792 DEBUG oslo_concurrency.lockutils [req-5194f3c1-ca31-44cc-bfe6-8b22cf69d88d req-918a1499-67f2-4615-8e80-1f61cdf15b9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.953 186792 DEBUG oslo_concurrency.lockutils [req-5194f3c1-ca31-44cc-bfe6-8b22cf69d88d req-918a1499-67f2-4615-8e80-1f61cdf15b9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.953 186792 DEBUG oslo_concurrency.lockutils [req-5194f3c1-ca31-44cc-bfe6-8b22cf69d88d req-918a1499-67f2-4615-8e80-1f61cdf15b9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:33 np0005531888 nova_compute[186788]: 2025-11-22 07:50:33.954 186792 DEBUG nova.compute.manager [req-5194f3c1-ca31-44cc-bfe6-8b22cf69d88d req-918a1499-67f2-4615-8e80-1f61cdf15b9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Processing event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.011 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb58269-fee5-4ff1-9010-0a4002b2623b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.013 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.014 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.014 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:34 np0005531888 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.016 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:34 np0005531888 NetworkManager[55166]: <info>  [1763797834.0169] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.019 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.021 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:34 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:34Z|00116|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.022 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.023 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[63d7d1d2-44b1-4b44-a0f7-4d609c7c2dd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.025 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:50:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:34.027 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.033 186792 DEBUG nova.network.neutron [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Updated VIF entry in instance network info cache for port b16926eb-7b26-4c46-9d49-b192a9fde0df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.033 186792 DEBUG nova.network.neutron [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Updating instance_info_cache with network_info: [{"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.035 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.050 186792 DEBUG oslo_concurrency.lockutils [req-6df31a93-4d44-4fd0-99d1-76b1b45078b3 req-add67ff1-8512-4042-96d3-da4da1dd1fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6d11506a-3658-40b5-ab70-cf036fa5b543" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.250 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.251 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797834.2495894, 6d11506a-3658-40b5-ab70-cf036fa5b543 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.251 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] VM Started (Lifecycle Event)#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.255 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.259 186792 INFO nova.virt.libvirt.driver [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Instance spawned successfully.#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.259 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.268 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.275 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.280 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.281 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.282 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.282 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.282 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.283 186792 DEBUG nova.virt.libvirt.driver [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.303 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.303 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797834.25024, 6d11506a-3658-40b5-ab70-cf036fa5b543 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.303 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.329 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.335 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797834.2529929, 6d11506a-3658-40b5-ab70-cf036fa5b543 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.336 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.360 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.364 186792 INFO nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Took 7.50 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.365 186792 DEBUG nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.368 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.396 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.450 186792 INFO nova.compute.manager [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Took 8.04 seconds to build instance.#033[00m
Nov 22 02:50:34 np0005531888 nova_compute[186788]: 2025-11-22 07:50:34.465 186792 DEBUG oslo_concurrency.lockutils [None req-a0413897-e5ed-4810-876b-478a5e2a029b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:34 np0005531888 podman[219929]: 2025-11-22 07:50:34.479272632 +0000 UTC m=+0.076388418 container create 6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:50:34 np0005531888 systemd[1]: Started libpod-conmon-6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d.scope.
Nov 22 02:50:34 np0005531888 podman[219929]: 2025-11-22 07:50:34.429763661 +0000 UTC m=+0.026879467 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:50:34 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:50:34 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc7b0bddf193ccc940eff65086f41faee491aa040ab5b3b36bc09bda716eebe6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:50:34 np0005531888 podman[219929]: 2025-11-22 07:50:34.584654027 +0000 UTC m=+0.181769833 container init 6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:50:34 np0005531888 podman[219929]: 2025-11-22 07:50:34.593444858 +0000 UTC m=+0.190560644 container start 6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:50:34 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [NOTICE]   (219948) : New worker (219950) forked
Nov 22 02:50:34 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [NOTICE]   (219948) : Loading success.
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.243 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.244 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.245 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.245 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.246 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.255 186792 INFO nova.compute.manager [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Terminating instance#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.262 186792 DEBUG nova.compute.manager [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:50:35 np0005531888 kernel: tapb16926eb-7b (unregistering): left promiscuous mode
Nov 22 02:50:35 np0005531888 NetworkManager[55166]: <info>  [1763797835.2818] device (tapb16926eb-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:35Z|00117|binding|INFO|Releasing lport b16926eb-7b26-4c46-9d49-b192a9fde0df from this chassis (sb_readonly=0)
Nov 22 02:50:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:35Z|00118|binding|INFO|Setting lport b16926eb-7b26-4c46-9d49-b192a9fde0df down in Southbound
Nov 22 02:50:35 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:35Z|00119|binding|INFO|Removing iface tapb16926eb-7b ovn-installed in OVS
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.302 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:ac:f0 10.100.0.6'], port_security=['fa:16:3e:9c:ac:f0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6d11506a-3658-40b5-ab70-cf036fa5b543', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=b16926eb-7b26-4c46-9d49-b192a9fde0df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.304 104023 INFO neutron.agent.ovn.metadata.agent [-] Port b16926eb-7b26-4c46-9d49-b192a9fde0df in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.306 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.308 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[816b51ca-987d-44a6-bd6d-2c464d171649]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.309 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531888 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Nov 22 02:50:35 np0005531888 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002f.scope: Consumed 1.489s CPU time.
Nov 22 02:50:35 np0005531888 systemd-machined[153106]: Machine qemu-23-instance-0000002f terminated.
Nov 22 02:50:35 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [NOTICE]   (219948) : haproxy version is 2.8.14-c23fe91
Nov 22 02:50:35 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [NOTICE]   (219948) : path to executable is /usr/sbin/haproxy
Nov 22 02:50:35 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [WARNING]  (219948) : Exiting Master process...
Nov 22 02:50:35 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [ALERT]    (219948) : Current worker (219950) exited with code 143 (Terminated)
Nov 22 02:50:35 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[219942]: [WARNING]  (219948) : All workers exited. Exiting... (0)
Nov 22 02:50:35 np0005531888 systemd[1]: libpod-6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d.scope: Deactivated successfully.
Nov 22 02:50:35 np0005531888 podman[219982]: 2025-11-22 07:50:35.46838017 +0000 UTC m=+0.053425366 container died 6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 02:50:35 np0005531888 NetworkManager[55166]: <info>  [1763797835.4917] manager: (tapb16926eb-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Nov 22 02:50:35 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d-userdata-shm.mount: Deactivated successfully.
Nov 22 02:50:35 np0005531888 systemd[1]: var-lib-containers-storage-overlay-dc7b0bddf193ccc940eff65086f41faee491aa040ab5b3b36bc09bda716eebe6-merged.mount: Deactivated successfully.
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.545 186792 INFO nova.virt.libvirt.driver [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Instance destroyed successfully.#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.546 186792 DEBUG nova.objects.instance [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid 6d11506a-3658-40b5-ab70-cf036fa5b543 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.659 186792 DEBUG nova.virt.libvirt.vif [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-945838049',display_name='tempest-DeleteServersTestJSON-server-945838049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-945838049',id=47,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-6u3zkf7z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:34Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=6d11506a-3658-40b5-ab70-cf036fa5b543,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.660 186792 DEBUG nova.network.os_vif_util [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "address": "fa:16:3e:9c:ac:f0", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb16926eb-7b", "ovs_interfaceid": "b16926eb-7b26-4c46-9d49-b192a9fde0df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.660 186792 DEBUG nova.network.os_vif_util [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.661 186792 DEBUG os_vif [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.663 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.663 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb16926eb-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.668 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.671 186792 INFO os_vif [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:ac:f0,bridge_name='br-int',has_traffic_filtering=True,id=b16926eb-7b26-4c46-9d49-b192a9fde0df,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb16926eb-7b')#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.671 186792 INFO nova.virt.libvirt.driver [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Deleting instance files /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543_del#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.672 186792 INFO nova.virt.libvirt.driver [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Deletion of /var/lib/nova/instances/6d11506a-3658-40b5-ab70-cf036fa5b543_del complete#033[00m
Nov 22 02:50:35 np0005531888 podman[219982]: 2025-11-22 07:50:35.69418788 +0000 UTC m=+0.279233086 container cleanup 6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 02:50:35 np0005531888 systemd[1]: libpod-conmon-6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d.scope: Deactivated successfully.
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.786 186792 INFO nova.compute.manager [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.787 186792 DEBUG oslo.service.loopingcall [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.787 186792 DEBUG nova.compute.manager [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.788 186792 DEBUG nova.network.neutron [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:50:35 np0005531888 podman[220031]: 2025-11-22 07:50:35.795364684 +0000 UTC m=+0.073248003 container remove 6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:50:35 np0005531888 podman[220032]: 2025-11-22 07:50:35.798211943 +0000 UTC m=+0.068811247 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.802 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7398955d-dac4-4092-917f-3c651c256186]: (4, ('Sat Nov 22 07:50:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d)\n6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d\nSat Nov 22 07:50:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d)\n6afc04e5ad47bd6f582aa39d259de6477498d2e15957ea421cef7b4deb72680d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.804 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bbd5b0-6d61-4ef2-a237-731a4e3e45ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.805 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:35 np0005531888 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.810 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531888 nova_compute[186788]: 2025-11-22 07:50:35.821 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.825 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0afb871e-063f-4dc1-ab2a-2039cc787eb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.855 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f343f144-b1b6-4219-a1f7-c6dd9a61eaaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.856 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3b04b5-2293-46d1-9c39-63ca5a2cd222]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.872 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8c98384b-6ff2-4e39-a5ff-a8106390e1ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457440, 'reachable_time': 36592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220071, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.874 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:50:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:35.874 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e37e70a5-2687-4cf8-b4ac-469c6ff6cdf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:35 np0005531888 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.489 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:36.803 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:36.803 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:36.804 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.819 186792 DEBUG nova.compute.manager [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.820 186792 DEBUG oslo_concurrency.lockutils [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.820 186792 DEBUG oslo_concurrency.lockutils [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.821 186792 DEBUG oslo_concurrency.lockutils [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.821 186792 DEBUG nova.compute.manager [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] No waiting events found dispatching network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.821 186792 WARNING nova.compute.manager [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received unexpected event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.821 186792 DEBUG nova.compute.manager [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-vif-unplugged-b16926eb-7b26-4c46-9d49-b192a9fde0df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.822 186792 DEBUG oslo_concurrency.lockutils [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.822 186792 DEBUG oslo_concurrency.lockutils [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.822 186792 DEBUG oslo_concurrency.lockutils [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.822 186792 DEBUG nova.compute.manager [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] No waiting events found dispatching network-vif-unplugged-b16926eb-7b26-4c46-9d49-b192a9fde0df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:36 np0005531888 nova_compute[186788]: 2025-11-22 07:50:36.822 186792 DEBUG nova.compute.manager [req-b44576f5-a774-43c2-acca-f1a522bb75ee req-8dd6641b-3118-45ad-b271-0d23685eb4a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-vif-unplugged-b16926eb-7b26-4c46-9d49-b192a9fde0df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.837 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:50:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:50:37 np0005531888 nova_compute[186788]: 2025-11-22 07:50:37.984 186792 DEBUG nova.network.neutron [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.009 186792 INFO nova.compute.manager [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Took 2.22 seconds to deallocate network for instance.#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.129 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.130 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.195 186792 DEBUG nova.compute.provider_tree [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.210 186792 DEBUG nova.scheduler.client.report [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.241 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.281 186792 INFO nova.scheduler.client.report [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocations for instance 6d11506a-3658-40b5-ab70-cf036fa5b543#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.429 186792 DEBUG oslo_concurrency.lockutils [None req-217bb2eb-f580-4963-afce-3a21969c0ee9 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.911 186792 DEBUG nova.compute.manager [req-c4b856f5-a8e3-4f64-9826-40e2f367f1a6 req-658dcab2-4759-40c9-8b2e-f427eba7b607 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.912 186792 DEBUG oslo_concurrency.lockutils [req-c4b856f5-a8e3-4f64-9826-40e2f367f1a6 req-658dcab2-4759-40c9-8b2e-f427eba7b607 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.912 186792 DEBUG oslo_concurrency.lockutils [req-c4b856f5-a8e3-4f64-9826-40e2f367f1a6 req-658dcab2-4759-40c9-8b2e-f427eba7b607 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.913 186792 DEBUG oslo_concurrency.lockutils [req-c4b856f5-a8e3-4f64-9826-40e2f367f1a6 req-658dcab2-4759-40c9-8b2e-f427eba7b607 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d11506a-3658-40b5-ab70-cf036fa5b543-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.913 186792 DEBUG nova.compute.manager [req-c4b856f5-a8e3-4f64-9826-40e2f367f1a6 req-658dcab2-4759-40c9-8b2e-f427eba7b607 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] No waiting events found dispatching network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.913 186792 WARNING nova.compute.manager [req-c4b856f5-a8e3-4f64-9826-40e2f367f1a6 req-658dcab2-4759-40c9-8b2e-f427eba7b607 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received unexpected event network-vif-plugged-b16926eb-7b26-4c46-9d49-b192a9fde0df for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:50:38 np0005531888 nova_compute[186788]: 2025-11-22 07:50:38.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:50:39 np0005531888 podman[220072]: 2025-11-22 07:50:39.677892349 +0000 UTC m=+0.047974705 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:50:39 np0005531888 nova_compute[186788]: 2025-11-22 07:50:39.961 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:40 np0005531888 nova_compute[186788]: 2025-11-22 07:50:40.114 186792 DEBUG nova.compute.manager [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Received event network-vif-deleted-b16926eb-7b26-4c46-9d49-b192a9fde0df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:40 np0005531888 nova_compute[186788]: 2025-11-22 07:50:40.669 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:40 np0005531888 nova_compute[186788]: 2025-11-22 07:50:40.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:41 np0005531888 nova_compute[186788]: 2025-11-22 07:50:41.491 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:41 np0005531888 nova_compute[186788]: 2025-11-22 07:50:41.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:43 np0005531888 nova_compute[186788]: 2025-11-22 07:50:43.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:44 np0005531888 nova_compute[186788]: 2025-11-22 07:50:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.142 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.142 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.162 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.247 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.248 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.255 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.255 186792 INFO nova.compute.claims [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.386 186792 DEBUG nova.compute.provider_tree [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.398 186792 DEBUG nova.scheduler.client.report [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.417 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.418 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.658 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.658 186792 DEBUG nova.network.neutron [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.672 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.719 186792 INFO nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.769 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.886 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.889 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.889 186792 INFO nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Creating image(s)#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.891 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.892 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.893 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.906 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.983 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.984 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.984 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:45 np0005531888 nova_compute[186788]: 2025-11-22 07:50:45.996 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.062 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.064 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.119 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.121 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.122 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.185 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.187 186792 DEBUG nova.virt.disk.api [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.187 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.240 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.242 186792 DEBUG nova.virt.disk.api [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.242 186792 DEBUG nova.objects.instance [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.256 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.257 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Ensure instance console log exists: /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.257 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.257 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.258 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.493 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:46 np0005531888 podman[220106]: 2025-11-22 07:50:46.688994684 +0000 UTC m=+0.064310407 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:50:46 np0005531888 nova_compute[186788]: 2025-11-22 07:50:46.720 186792 DEBUG nova.policy [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:50:47 np0005531888 nova_compute[186788]: 2025-11-22 07:50:47.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:47 np0005531888 nova_compute[186788]: 2025-11-22 07:50:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:50:48 np0005531888 nova_compute[186788]: 2025-11-22 07:50:48.028 186792 DEBUG nova.network.neutron [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Successfully created port: 07dba905-852e-401e-94ed-e43c30165aa8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:50:48 np0005531888 podman[220126]: 2025-11-22 07:50:48.708949824 +0000 UTC m=+0.083438008 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:50:48 np0005531888 nova_compute[186788]: 2025-11-22 07:50:48.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:48 np0005531888 nova_compute[186788]: 2025-11-22 07:50:48.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:48 np0005531888 nova_compute[186788]: 2025-11-22 07:50:48.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:48 np0005531888 nova_compute[186788]: 2025-11-22 07:50:48.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:48 np0005531888 nova_compute[186788]: 2025-11-22 07:50:48.977 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.125 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.127 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5737MB free_disk=73.42208480834961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.127 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.127 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.209 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.209 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.210 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.252 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.274 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.303 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.304 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.416 186792 DEBUG nova.network.neutron [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Successfully updated port: 07dba905-852e-401e-94ed-e43c30165aa8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.432 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.432 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.432 186792 DEBUG nova.network.neutron [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.523 186792 DEBUG nova.compute.manager [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-changed-07dba905-852e-401e-94ed-e43c30165aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.524 186792 DEBUG nova.compute.manager [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Refreshing instance network info cache due to event network-changed-07dba905-852e-401e-94ed-e43c30165aa8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.524 186792 DEBUG oslo_concurrency.lockutils [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:49 np0005531888 nova_compute[186788]: 2025-11-22 07:50:49.586 186792 DEBUG nova.network.neutron [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:50:50 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.304 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:50 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.544 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797835.5431643, 6d11506a-3658-40b5-ab70-cf036fa5b543 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:50 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.545 186792 INFO nova.compute.manager [-] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:50:50 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.617 186792 DEBUG nova.compute.manager [None req-3e56df51-ff19-401f-a511-e78302500217 - - - - - -] [instance: 6d11506a-3658-40b5-ab70-cf036fa5b543] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:50 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.676 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.959 186792 DEBUG nova.network.neutron [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Updating instance_info_cache with network_info: [{"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.987 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.988 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Instance network_info: |[{"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.989 186792 DEBUG oslo_concurrency.lockutils [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.990 186792 DEBUG nova.network.neutron [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Refreshing network info cache for port 07dba905-852e-401e-94ed-e43c30165aa8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:50.995 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Start _get_guest_xml network_info=[{"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.633 186792 WARNING nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.645 186792 DEBUG nova.virt.libvirt.host [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.646 186792 DEBUG nova.virt.libvirt.host [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.651 186792 DEBUG nova.virt.libvirt.host [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.652 186792 DEBUG nova.virt.libvirt.host [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.655 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.655 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.655 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.656 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.656 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.656 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.656 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.657 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.657 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.657 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.657 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.658 186792 DEBUG nova.virt.hardware [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.665 186792 DEBUG nova.virt.libvirt.vif [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1974740348',display_name='tempest-DeleteServersTestJSON-server-1974740348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1974740348',id=51,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-45nm5hoh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:45Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.665 186792 DEBUG nova.network.os_vif_util [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.666 186792 DEBUG nova.network.os_vif_util [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.667 186792 DEBUG nova.objects.instance [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.669 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.702 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <uuid>eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e</uuid>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <name>instance-00000033</name>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:name>tempest-DeleteServersTestJSON-server-1974740348</nova:name>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:50:52</nova:creationTime>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        <nova:port uuid="07dba905-852e-401e-94ed-e43c30165aa8">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <entry name="serial">eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e</entry>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <entry name="uuid">eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e</entry>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.config"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:28:5e:53"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <target dev="tap07dba905-85"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/console.log" append="off"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:50:52 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:50:52 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:50:52 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:50:52 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.703 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Preparing to wait for external event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.703 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.703 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.703 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.704 186792 DEBUG nova.virt.libvirt.vif [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1974740348',display_name='tempest-DeleteServersTestJSON-server-1974740348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1974740348',id=51,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-45nm5hoh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:45Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.704 186792 DEBUG nova.network.os_vif_util [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.705 186792 DEBUG nova.network.os_vif_util [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.705 186792 DEBUG os_vif [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.705 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.706 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.706 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.713 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.714 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07dba905-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.714 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap07dba905-85, col_values=(('external_ids', {'iface-id': '07dba905-852e-401e-94ed-e43c30165aa8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:5e:53', 'vm-uuid': 'eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:52 np0005531888 podman[220149]: 2025-11-22 07:50:52.715321236 +0000 UTC m=+0.074294557 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:52 np0005531888 NetworkManager[55166]: <info>  [1763797852.7188] manager: (tap07dba905-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.720 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.729 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.730 186792 INFO os_vif [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85')#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.785 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.786 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.786 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:28:5e:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:50:52 np0005531888 nova_compute[186788]: 2025-11-22 07:50:52.786 186792 INFO nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Using config drive#033[00m
Nov 22 02:50:53 np0005531888 nova_compute[186788]: 2025-11-22 07:50:53.697 186792 DEBUG nova.network.neutron [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Updated VIF entry in instance network info cache for port 07dba905-852e-401e-94ed-e43c30165aa8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:50:53 np0005531888 nova_compute[186788]: 2025-11-22 07:50:53.697 186792 DEBUG nova.network.neutron [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Updating instance_info_cache with network_info: [{"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:53 np0005531888 nova_compute[186788]: 2025-11-22 07:50:53.711 186792 DEBUG oslo_concurrency.lockutils [req-54024f6f-5c97-47f9-8777-8f4ba7cb0554 req-0d346a2b-fc6c-4c79-8b71-f41a0e2d26cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:53 np0005531888 nova_compute[186788]: 2025-11-22 07:50:53.847 186792 INFO nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Creating config drive at /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.config#033[00m
Nov 22 02:50:53 np0005531888 nova_compute[186788]: 2025-11-22 07:50:53.857 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbix3tu5g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.000 186792 DEBUG oslo_concurrency.processutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbix3tu5g" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:54 np0005531888 NetworkManager[55166]: <info>  [1763797854.0983] manager: (tap07dba905-85): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Nov 22 02:50:54 np0005531888 kernel: tap07dba905-85: entered promiscuous mode
Nov 22 02:50:54 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:54Z|00120|binding|INFO|Claiming lport 07dba905-852e-401e-94ed-e43c30165aa8 for this chassis.
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.101 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:54Z|00121|binding|INFO|07dba905-852e-401e-94ed-e43c30165aa8: Claiming fa:16:3e:28:5e:53 10.100.0.8
Nov 22 02:50:54 np0005531888 systemd-udevd[220189]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:50:54 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:54Z|00122|binding|INFO|Setting lport 07dba905-852e-401e-94ed-e43c30165aa8 ovn-installed in OVS
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.138 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 NetworkManager[55166]: <info>  [1763797854.1478] device (tap07dba905-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:50:54 np0005531888 NetworkManager[55166]: <info>  [1763797854.1487] device (tap07dba905-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:50:54 np0005531888 systemd-machined[153106]: New machine qemu-24-instance-00000033.
Nov 22 02:50:54 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:54Z|00123|binding|INFO|Setting lport 07dba905-852e-401e-94ed-e43c30165aa8 up in Southbound
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.170 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:5e:53 10.100.0.8'], port_security=['fa:16:3e:28:5e:53 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=07dba905-852e-401e-94ed-e43c30165aa8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.172 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 07dba905-852e-401e-94ed-e43c30165aa8 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.174 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c#033[00m
Nov 22 02:50:54 np0005531888 systemd[1]: Started Virtual Machine qemu-24-instance-00000033.
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.194 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[17e7f73f-b685-4ec0-98d5-12c06bd2ec61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.196 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.197 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.198 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b3af49b9-6a7c-43af-a912-9fb9e3d6e043]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.199 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[041822ea-8dd2-48db-9101-900ed1f08c3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.221 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[c7943d17-7749-474c-a275-1093770fe591]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.235 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[58ba38a7-3084-4b93-91bd-93a9cd3ba5f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.272 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[384bfc4e-00f9-483b-a69b-01d5c9ad807e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 NetworkManager[55166]: <info>  [1763797854.2804] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.281 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[302ed297-9156-4623-8f60-be105c9af8c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.313 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6c327015-7a79-4d53-96f7-dabe90deee90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.317 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0e4b6b-8434-450f-879f-c12aee704b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 NetworkManager[55166]: <info>  [1763797854.3418] device (tap5e910dbb-20): carrier: link connected
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.346 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8d502efa-3775-46ac-a970-6bb08ae7183a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.364 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bc263112-4df7-4bc8-8922-25ffd424870f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459498, 'reachable_time': 43349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220225, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.378 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d89e8d50-5bde-465b-b994-eb721c9d28e3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459498, 'tstamp': 459498}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220226, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.395 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd45fe3-40ba-4ca4-9fe9-1d85f6d35e79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459498, 'reachable_time': 43349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220227, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.429 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[72d6abf1-3a40-4c2a-8dda-ac7933d70981]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.498 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0c73211c-a959-4611-b99d-552d7114a9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.500 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.500 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.501 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.503 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 NetworkManager[55166]: <info>  [1763797854.5036] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Nov 22 02:50:54 np0005531888 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.505 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.506 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.508 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 ovn_controller[95067]: 2025-11-22T07:50:54Z|00124|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.526 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.531 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.532 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1df146a7-d0d4-4c6e-980e-aefcd507eab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.534 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:50:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:54.535 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.580 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797854.5791266, eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.580 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] VM Started (Lifecycle Event)#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.600 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.607 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797854.5798078, eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.608 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.626 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.630 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.653 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:50:54 np0005531888 podman[220265]: 2025-11-22 07:50:54.93960696 +0000 UTC m=+0.051291775 container create 0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.948 186792 DEBUG nova.compute.manager [req-5ffa4abc-e0aa-4d13-83f3-641d36f49ac7 req-7d047dc1-c408-4d5f-bb5a-32228ff39000 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.948 186792 DEBUG oslo_concurrency.lockutils [req-5ffa4abc-e0aa-4d13-83f3-641d36f49ac7 req-7d047dc1-c408-4d5f-bb5a-32228ff39000 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.949 186792 DEBUG oslo_concurrency.lockutils [req-5ffa4abc-e0aa-4d13-83f3-641d36f49ac7 req-7d047dc1-c408-4d5f-bb5a-32228ff39000 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.949 186792 DEBUG oslo_concurrency.lockutils [req-5ffa4abc-e0aa-4d13-83f3-641d36f49ac7 req-7d047dc1-c408-4d5f-bb5a-32228ff39000 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.949 186792 DEBUG nova.compute.manager [req-5ffa4abc-e0aa-4d13-83f3-641d36f49ac7 req-7d047dc1-c408-4d5f-bb5a-32228ff39000 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Processing event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.950 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.955 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797854.9549043, eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.955 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.958 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.961 186792 INFO nova.virt.libvirt.driver [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Instance spawned successfully.#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.962 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:50:54 np0005531888 systemd[1]: Started libpod-conmon-0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56.scope.
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.976 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.985 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.989 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.990 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.990 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.995 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.996 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:54 np0005531888 nova_compute[186788]: 2025-11-22 07:50:54.996 186792 DEBUG nova.virt.libvirt.driver [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:55 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:50:55 np0005531888 podman[220265]: 2025-11-22 07:50:54.914421495 +0000 UTC m=+0.026106350 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:50:55 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/849d120739d7b0046093c30a62c7de03426187e52c83ab005fce58034707c71a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:50:55 np0005531888 podman[220265]: 2025-11-22 07:50:55.025007074 +0000 UTC m=+0.136691919 container init 0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:50:55 np0005531888 podman[220265]: 2025-11-22 07:50:55.032725809 +0000 UTC m=+0.144410634 container start 0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:50:55 np0005531888 nova_compute[186788]: 2025-11-22 07:50:55.036 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:50:55 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [NOTICE]   (220284) : New worker (220286) forked
Nov 22 02:50:55 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [NOTICE]   (220284) : Loading success.
Nov 22 02:50:55 np0005531888 nova_compute[186788]: 2025-11-22 07:50:55.130 186792 INFO nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Took 9.24 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:50:55 np0005531888 nova_compute[186788]: 2025-11-22 07:50:55.131 186792 DEBUG nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:55 np0005531888 nova_compute[186788]: 2025-11-22 07:50:55.221 186792 INFO nova.compute.manager [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Took 10.01 seconds to build instance.#033[00m
Nov 22 02:50:55 np0005531888 nova_compute[186788]: 2025-11-22 07:50:55.330 186792 DEBUG oslo_concurrency.lockutils [None req-c46338c1-ae13-4eeb-8047-6125829d42e3 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:57.039 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:57.051 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.040 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.054 186792 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.055 186792 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.056 186792 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.056 186792 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.056 186792 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] No waiting events found dispatching network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.056 186792 WARNING nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received unexpected event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 for instance with vm_state active and task_state pausing.#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.127 186792 INFO nova.compute.manager [None req-34a93e56-fcfd-445a-85c7-8eb8229ddfc7 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Pausing#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.128 186792 DEBUG nova.objects.instance [None req-34a93e56-fcfd-445a-85c7-8eb8229ddfc7 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'flavor' on Instance uuid eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.174 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797857.1741922, eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.175 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.177 186792 DEBUG nova.compute.manager [None req-34a93e56-fcfd-445a-85c7-8eb8229ddfc7 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.215 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.220 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.661 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:57 np0005531888 nova_compute[186788]: 2025-11-22 07:50:57.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:50:59.058 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.095 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.097 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.097 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.098 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.098 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.106 186792 INFO nova.compute.manager [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Terminating instance#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.112 186792 DEBUG nova.compute.manager [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:51:00 np0005531888 kernel: tap07dba905-85 (unregistering): left promiscuous mode
Nov 22 02:51:00 np0005531888 NetworkManager[55166]: <info>  [1763797860.1345] device (tap07dba905-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:51:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:00Z|00125|binding|INFO|Releasing lport 07dba905-852e-401e-94ed-e43c30165aa8 from this chassis (sb_readonly=0)
Nov 22 02:51:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:00Z|00126|binding|INFO|Setting lport 07dba905-852e-401e-94ed-e43c30165aa8 down in Southbound
Nov 22 02:51:00 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:00Z|00127|binding|INFO|Removing iface tap07dba905-85 ovn-installed in OVS
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.150 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.169 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:5e:53 10.100.0.8'], port_security=['fa:16:3e:28:5e:53 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=07dba905-852e-401e-94ed-e43c30165aa8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.171 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 07dba905-852e-401e-94ed-e43c30165aa8 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.173 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.176 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.175 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1eeef92-eca9-401e-8b76-45ee9c874264]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.177 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore#033[00m
Nov 22 02:51:00 np0005531888 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000033.scope: Deactivated successfully.
Nov 22 02:51:00 np0005531888 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000033.scope: Consumed 2.673s CPU time.
Nov 22 02:51:00 np0005531888 systemd-machined[153106]: Machine qemu-24-instance-00000033 terminated.
Nov 22 02:51:00 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [NOTICE]   (220284) : haproxy version is 2.8.14-c23fe91
Nov 22 02:51:00 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [NOTICE]   (220284) : path to executable is /usr/sbin/haproxy
Nov 22 02:51:00 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [WARNING]  (220284) : Exiting Master process...
Nov 22 02:51:00 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [WARNING]  (220284) : Exiting Master process...
Nov 22 02:51:00 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [ALERT]    (220284) : Current worker (220286) exited with code 143 (Terminated)
Nov 22 02:51:00 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220280]: [WARNING]  (220284) : All workers exited. Exiting... (0)
Nov 22 02:51:00 np0005531888 systemd[1]: libpod-0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56.scope: Deactivated successfully.
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 podman[220319]: 2025-11-22 07:51:00.337713503 +0000 UTC m=+0.051702985 container died 0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56-userdata-shm.mount: Deactivated successfully.
Nov 22 02:51:00 np0005531888 systemd[1]: var-lib-containers-storage-overlay-849d120739d7b0046093c30a62c7de03426187e52c83ab005fce58034707c71a-merged.mount: Deactivated successfully.
Nov 22 02:51:00 np0005531888 podman[220319]: 2025-11-22 07:51:00.374685972 +0000 UTC m=+0.088675444 container cleanup 0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.383 186792 DEBUG nova.compute.manager [req-fb4759fe-d891-413b-9489-c6f55061ea10 req-da1ab55e-aca4-4a09-a8b7-91626a9b398b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-vif-unplugged-07dba905-852e-401e-94ed-e43c30165aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.383 186792 DEBUG oslo_concurrency.lockutils [req-fb4759fe-d891-413b-9489-c6f55061ea10 req-da1ab55e-aca4-4a09-a8b7-91626a9b398b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.383 186792 DEBUG oslo_concurrency.lockutils [req-fb4759fe-d891-413b-9489-c6f55061ea10 req-da1ab55e-aca4-4a09-a8b7-91626a9b398b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.384 186792 DEBUG oslo_concurrency.lockutils [req-fb4759fe-d891-413b-9489-c6f55061ea10 req-da1ab55e-aca4-4a09-a8b7-91626a9b398b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.384 186792 DEBUG nova.compute.manager [req-fb4759fe-d891-413b-9489-c6f55061ea10 req-da1ab55e-aca4-4a09-a8b7-91626a9b398b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] No waiting events found dispatching network-vif-unplugged-07dba905-852e-401e-94ed-e43c30165aa8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.384 186792 DEBUG nova.compute.manager [req-fb4759fe-d891-413b-9489-c6f55061ea10 req-da1ab55e-aca4-4a09-a8b7-91626a9b398b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-vif-unplugged-07dba905-852e-401e-94ed-e43c30165aa8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.388 186792 INFO nova.virt.libvirt.driver [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Instance destroyed successfully.#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.388 186792 DEBUG nova.objects.instance [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:00 np0005531888 systemd[1]: libpod-conmon-0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56.scope: Deactivated successfully.
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.402 186792 DEBUG nova.virt.libvirt.vif [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1974740348',display_name='tempest-DeleteServersTestJSON-server-1974740348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1974740348',id=51,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-45nm5hoh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:57Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.403 186792 DEBUG nova.network.os_vif_util [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "07dba905-852e-401e-94ed-e43c30165aa8", "address": "fa:16:3e:28:5e:53", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07dba905-85", "ovs_interfaceid": "07dba905-852e-401e-94ed-e43c30165aa8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.404 186792 DEBUG nova.network.os_vif_util [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.404 186792 DEBUG os_vif [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.406 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07dba905-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.410 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.412 186792 INFO os_vif [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:5e:53,bridge_name='br-int',has_traffic_filtering=True,id=07dba905-852e-401e-94ed-e43c30165aa8,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07dba905-85')#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.413 186792 INFO nova.virt.libvirt.driver [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Deleting instance files /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e_del#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.414 186792 INFO nova.virt.libvirt.driver [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Deletion of /var/lib/nova/instances/eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e_del complete#033[00m
Nov 22 02:51:00 np0005531888 podman[220358]: 2025-11-22 07:51:00.448706332 +0000 UTC m=+0.049830539 container remove 0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.454 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[00f7c5f0-8f41-4239-9161-1f036439ced8]: (4, ('Sat Nov 22 07:51:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56)\n0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56\nSat Nov 22 07:51:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56)\n0168a464f6bdb8328518965b5d29482c40a3e5e03c52fc1ba80654ffe19b0f56\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.457 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[272206d4-47d7-43ee-be8e-79d3eb36d364]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.458 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:00 np0005531888 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.460 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.474 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.476 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4da99f8c-9977-4da4-a97c-c53f2f622fec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.487 186792 INFO nova.compute.manager [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.488 186792 DEBUG oslo.service.loopingcall [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.488 186792 DEBUG nova.compute.manager [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:51:00 np0005531888 nova_compute[186788]: 2025-11-22 07:51:00.488 186792 DEBUG nova.network.neutron [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.506 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5d5011-bdfe-411c-8f1a-2f29705d5483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.508 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e8fe13-0685-4164-bc65-06ba429b8645]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.528 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e7778484-44bb-4530-8577-a959685a05d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459491, 'reachable_time': 38696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220373, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.532 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:51:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:00.532 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[6a89e364-6d49-4ece-8204-76d8b30f5f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531888 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 02:51:01 np0005531888 podman[220376]: 2025-11-22 07:51:01.190805039 +0000 UTC m=+0.069589224 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 02:51:01 np0005531888 podman[220377]: 2025-11-22 07:51:01.222618765 +0000 UTC m=+0.102098067 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.255 186792 DEBUG nova.network.neutron [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.276 186792 INFO nova.compute.manager [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Took 1.79 seconds to deallocate network for instance.#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.351 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.352 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.427 186792 DEBUG nova.compute.provider_tree [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.443 186792 DEBUG nova.scheduler.client.report [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.497 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.502 186792 DEBUG nova.compute.manager [req-565eeb8d-2619-457c-810a-f35113016260 req-4d09eb64-0d77-4d4c-bbbd-e80e6b980fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.503 186792 DEBUG oslo_concurrency.lockutils [req-565eeb8d-2619-457c-810a-f35113016260 req-4d09eb64-0d77-4d4c-bbbd-e80e6b980fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.503 186792 DEBUG oslo_concurrency.lockutils [req-565eeb8d-2619-457c-810a-f35113016260 req-4d09eb64-0d77-4d4c-bbbd-e80e6b980fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.503 186792 DEBUG oslo_concurrency.lockutils [req-565eeb8d-2619-457c-810a-f35113016260 req-4d09eb64-0d77-4d4c-bbbd-e80e6b980fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.504 186792 DEBUG nova.compute.manager [req-565eeb8d-2619-457c-810a-f35113016260 req-4d09eb64-0d77-4d4c-bbbd-e80e6b980fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] No waiting events found dispatching network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.504 186792 WARNING nova.compute.manager [req-565eeb8d-2619-457c-810a-f35113016260 req-4d09eb64-0d77-4d4c-bbbd-e80e6b980fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received unexpected event network-vif-plugged-07dba905-852e-401e-94ed-e43c30165aa8 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.558 186792 INFO nova.scheduler.client.report [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocations for instance eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.664 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:02 np0005531888 nova_compute[186788]: 2025-11-22 07:51:02.789 186792 DEBUG oslo_concurrency.lockutils [None req-eb3dd376-03e3-4165-a887-8ce2243db30e 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:04 np0005531888 nova_compute[186788]: 2025-11-22 07:51:04.533 186792 DEBUG nova.compute.manager [req-617289c2-cec5-4e98-9995-0d9691663e9b req-08c55ef7-46b2-42d0-9ac5-2d60be80c554 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Received event network-vif-deleted-07dba905-852e-401e-94ed-e43c30165aa8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:05 np0005531888 nova_compute[186788]: 2025-11-22 07:51:05.409 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:06 np0005531888 podman[220417]: 2025-11-22 07:51:06.691619683 +0000 UTC m=+0.061013169 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:51:07 np0005531888 nova_compute[186788]: 2025-11-22 07:51:07.666 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.121 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.121 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.135 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.214 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.215 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.221 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.222 186792 INFO nova.compute.claims [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.393 186792 DEBUG nova.compute.provider_tree [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.409 186792 DEBUG nova.scheduler.client.report [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.432 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.433 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.492 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.492 186792 DEBUG nova.network.neutron [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.511 186792 INFO nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.531 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.631 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.632 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.633 186792 INFO nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Creating image(s)#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.634 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.634 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.635 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.647 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.710 186792 DEBUG nova.policy [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.719 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.720 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.721 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.732 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.789 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.790 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.861 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.863 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.863 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.930 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.931 186792 DEBUG nova.virt.disk.api [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:51:08 np0005531888 nova_compute[186788]: 2025-11-22 07:51:08.932 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.004 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.005 186792 DEBUG nova.virt.disk.api [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.006 186792 DEBUG nova.objects.instance [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid 74d7bd73-125a-4539-8622-d0225143bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.023 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.023 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Ensure instance console log exists: /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.024 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.024 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.025 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:09 np0005531888 nova_compute[186788]: 2025-11-22 07:51:09.889 186792 DEBUG nova.network.neutron [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Successfully created port: f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:51:10 np0005531888 nova_compute[186788]: 2025-11-22 07:51:10.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:10 np0005531888 podman[220456]: 2025-11-22 07:51:10.682666368 +0000 UTC m=+0.055162489 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.230 186792 DEBUG nova.network.neutron [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Successfully updated port: f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.259 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.260 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.260 186792 DEBUG nova.network.neutron [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.373 186792 DEBUG nova.compute.manager [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received event network-changed-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.373 186792 DEBUG nova.compute.manager [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Refreshing instance network info cache due to event network-changed-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.374 186792 DEBUG oslo_concurrency.lockutils [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:11 np0005531888 nova_compute[186788]: 2025-11-22 07:51:11.448 186792 DEBUG nova.network.neutron [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.669 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.707 186792 DEBUG nova.network.neutron [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updating instance_info_cache with network_info: [{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.843 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.844 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance network_info: |[{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.844 186792 DEBUG oslo_concurrency.lockutils [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.845 186792 DEBUG nova.network.neutron [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Refreshing network info cache for port f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.848 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Start _get_guest_xml network_info=[{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.856 186792 WARNING nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.865 186792 DEBUG nova.virt.libvirt.host [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.865 186792 DEBUG nova.virt.libvirt.host [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.869 186792 DEBUG nova.virt.libvirt.host [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.870 186792 DEBUG nova.virt.libvirt.host [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.870 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.871 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.871 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.871 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.872 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.872 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.872 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.872 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.872 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.872 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.873 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.873 186792 DEBUG nova.virt.hardware [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.877 186792 DEBUG nova.virt.libvirt.vif [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-958897149',display_name='tempest-DeleteServersTestJSON-server-958897149',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-958897149',id=52,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-kqgg1fvg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-5
50712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:51:08Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=74d7bd73-125a-4539-8622-d0225143bd08,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.878 186792 DEBUG nova.network.os_vif_util [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.878 186792 DEBUG nova.network.os_vif_util [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.879 186792 DEBUG nova.objects.instance [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74d7bd73-125a-4539-8622-d0225143bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.906 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <uuid>74d7bd73-125a-4539-8622-d0225143bd08</uuid>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <name>instance-00000034</name>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:name>tempest-DeleteServersTestJSON-server-958897149</nova:name>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:51:12</nova:creationTime>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        <nova:port uuid="f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <entry name="serial">74d7bd73-125a-4539-8622-d0225143bd08</entry>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <entry name="uuid">74d7bd73-125a-4539-8622-d0225143bd08</entry>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.config"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:3a:53:4b"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <target dev="tapf3c93f0d-0d"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/console.log" append="off"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:51:12 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:51:12 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:51:12 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:51:12 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.908 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Preparing to wait for external event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.908 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.908 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.909 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.909 186792 DEBUG nova.virt.libvirt.vif [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-958897149',display_name='tempest-DeleteServersTestJSON-server-958897149',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-958897149',id=52,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-kqgg1fvg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServers
TestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:51:08Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=74d7bd73-125a-4539-8622-d0225143bd08,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.909 186792 DEBUG nova.network.os_vif_util [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.910 186792 DEBUG nova.network.os_vif_util [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.910 186792 DEBUG os_vif [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.911 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.911 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.912 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.915 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.915 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3c93f0d-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.915 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf3c93f0d-0d, col_values=(('external_ids', {'iface-id': 'f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:53:4b', 'vm-uuid': '74d7bd73-125a-4539-8622-d0225143bd08'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.917 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:51:12 np0005531888 NetworkManager[55166]: <info>  [1763797872.9182] manager: (tapf3c93f0d-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.919 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.928 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.929 186792 INFO os_vif [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d')#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.990 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.990 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.990 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:3a:53:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:51:12 np0005531888 nova_compute[186788]: 2025-11-22 07:51:12.991 186792 INFO nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Using config drive#033[00m
Nov 22 02:51:13 np0005531888 nova_compute[186788]: 2025-11-22 07:51:13.765 186792 INFO nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Creating config drive at /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.config#033[00m
Nov 22 02:51:13 np0005531888 nova_compute[186788]: 2025-11-22 07:51:13.771 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1ncfmh_h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:13 np0005531888 nova_compute[186788]: 2025-11-22 07:51:13.897 186792 DEBUG oslo_concurrency.processutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1ncfmh_h" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:13 np0005531888 kernel: tapf3c93f0d-0d: entered promiscuous mode
Nov 22 02:51:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:13Z|00128|binding|INFO|Claiming lport f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 for this chassis.
Nov 22 02:51:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:13Z|00129|binding|INFO|f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027: Claiming fa:16:3e:3a:53:4b 10.100.0.5
Nov 22 02:51:13 np0005531888 nova_compute[186788]: 2025-11-22 07:51:13.952 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:13 np0005531888 NetworkManager[55166]: <info>  [1763797873.9539] manager: (tapf3c93f0d-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Nov 22 02:51:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:13Z|00130|binding|INFO|Setting lport f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 ovn-installed in OVS
Nov 22 02:51:13 np0005531888 nova_compute[186788]: 2025-11-22 07:51:13.966 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:13 np0005531888 nova_compute[186788]: 2025-11-22 07:51:13.968 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:13 np0005531888 systemd-udevd[220495]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:51:13 np0005531888 systemd-machined[153106]: New machine qemu-25-instance-00000034.
Nov 22 02:51:14 np0005531888 NetworkManager[55166]: <info>  [1763797874.0010] device (tapf3c93f0d-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:51:14 np0005531888 NetworkManager[55166]: <info>  [1763797874.0028] device (tapf3c93f0d-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:51:14 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:14Z|00131|binding|INFO|Setting lport f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 up in Southbound
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.008 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:53:4b 10.100.0.5'], port_security=['fa:16:3e:3a:53:4b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '74d7bd73-125a-4539-8622-d0225143bd08', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.010 104023 INFO neutron.agent.ovn.metadata.agent [-] Port f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis#033[00m
Nov 22 02:51:14 np0005531888 systemd[1]: Started Virtual Machine qemu-25-instance-00000034.
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.011 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.023 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa3355a-942d-4b85-bfe5-626d0054f6a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.024 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.026 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.026 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[98961507-e1c5-4da9-89bd-a2e0154d60f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.027 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e0cc1b-2996-4f7e-877c-8c221554f37f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.041 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[b72990ee-196a-4646-a05d-61e7493e8c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.066 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd3c91d-2e4a-4c6b-9f19-0f11e558cd88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.095 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dd21fb6e-cc18-41f9-afd0-18bc724d56e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 NetworkManager[55166]: <info>  [1763797874.1028] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.102 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[68118530-0498-4978-b77e-6834b9cffd5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 systemd-udevd[220498]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.134 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[216e449f-8674-43be-8000-8dd4f29cad39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.137 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2ae32c-bbd6-4287-be3d-7698d8239b28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 NetworkManager[55166]: <info>  [1763797874.1622] device (tap5e910dbb-20): carrier: link connected
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.166 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[58ffd87c-e859-4c67-9059-1ad337ee80a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.182 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1e352b3e-49c8-45c2-a8fe-591b84dab2cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461480, 'reachable_time': 21459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220529, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.197 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[06b8a3b7-e31c-418a-958c-1c4fcaa6c248]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461480, 'tstamp': 461480}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220530, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.212 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1b06cc-10f6-4974-b6e1-fc4fbfca8905]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461480, 'reachable_time': 21459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220531, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.243 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a380a5ca-313b-428e-9567-55664e77ecc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.305 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[436b2ec7-7caa-43f6-a221-72c325334538]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.307 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.307 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.308 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:14 np0005531888 NetworkManager[55166]: <info>  [1763797874.3102] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 22 02:51:14 np0005531888 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.315 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:14 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:14Z|00132|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.316 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.320 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.320 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[615c9997-3eb4-40de-87bc-f111acf4a688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.321 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:51:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:14.322 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.328 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.657 186792 DEBUG nova.compute.manager [req-857a6261-1142-47f4-b5a8-5b68d675da7e req-2f92b403-fdad-4595-a365-e2e93f21e8e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.657 186792 DEBUG oslo_concurrency.lockutils [req-857a6261-1142-47f4-b5a8-5b68d675da7e req-2f92b403-fdad-4595-a365-e2e93f21e8e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.657 186792 DEBUG oslo_concurrency.lockutils [req-857a6261-1142-47f4-b5a8-5b68d675da7e req-2f92b403-fdad-4595-a365-e2e93f21e8e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.658 186792 DEBUG oslo_concurrency.lockutils [req-857a6261-1142-47f4-b5a8-5b68d675da7e req-2f92b403-fdad-4595-a365-e2e93f21e8e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.658 186792 DEBUG nova.compute.manager [req-857a6261-1142-47f4-b5a8-5b68d675da7e req-2f92b403-fdad-4595-a365-e2e93f21e8e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Processing event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:51:14 np0005531888 podman[220563]: 2025-11-22 07:51:14.739478053 +0000 UTC m=+0.054295927 container create 50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 02:51:14 np0005531888 systemd[1]: Started libpod-conmon-50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744.scope.
Nov 22 02:51:14 np0005531888 podman[220563]: 2025-11-22 07:51:14.710546287 +0000 UTC m=+0.025364181 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:51:14 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:51:14 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/948612ec9c9cec77f1a92fed73ae8ff7ae66c9004518eaf41ce474ed75bd2b2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:51:14 np0005531888 podman[220563]: 2025-11-22 07:51:14.832978092 +0000 UTC m=+0.147795996 container init 50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 02:51:14 np0005531888 podman[220563]: 2025-11-22 07:51:14.839803206 +0000 UTC m=+0.154621090 container start 50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:51:14 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [NOTICE]   (220583) : New worker (220585) forked
Nov 22 02:51:14 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [NOTICE]   (220583) : Loading success.
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.878 186792 DEBUG nova.network.neutron [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updated VIF entry in instance network info cache for port f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.879 186792 DEBUG nova.network.neutron [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updating instance_info_cache with network_info: [{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:14 np0005531888 nova_compute[186788]: 2025-11-22 07:51:14.895 186792 DEBUG oslo_concurrency.lockutils [req-ef204880-dde1-452f-a040-77a24f55ed98 req-b4202fb4-218d-41e2-a48e-3fd1a9d4eb18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.387 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797860.3858075, eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.387 186792 INFO nova.compute.manager [-] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.416 186792 DEBUG nova.compute.manager [None req-389d57ef-58cc-4678-bca5-ed4b09eecee5 - - - - - -] [instance: eeac187a-dd8a-4eb3-9f69-6f5c1e708e9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.744 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797875.74369, 74d7bd73-125a-4539-8622-d0225143bd08 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.744 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] VM Started (Lifecycle Event)#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.748 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.752 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.757 186792 INFO nova.virt.libvirt.driver [-] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance spawned successfully.#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.758 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.773 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.780 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.783 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.784 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.784 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.785 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.785 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.786 186792 DEBUG nova.virt.libvirt.driver [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.815 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.815 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797875.7438293, 74d7bd73-125a-4539-8622-d0225143bd08 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.816 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.844 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.849 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797875.75137, 74d7bd73-125a-4539-8622-d0225143bd08 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.849 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.888 186792 INFO nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Took 7.26 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.889 186792 DEBUG nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.893 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.900 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.937 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.980 186792 INFO nova.compute.manager [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Took 7.80 seconds to build instance.#033[00m
Nov 22 02:51:15 np0005531888 nova_compute[186788]: 2025-11-22 07:51:15.997 186792 DEBUG oslo_concurrency.lockutils [None req-3f290c7a-6cfa-40d0-ac7c-4c304dabc158 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.876s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:16 np0005531888 nova_compute[186788]: 2025-11-22 07:51:16.749 186792 DEBUG nova.compute.manager [req-7c52cf62-0e32-4b85-8dfa-89399d4b67b9 req-eed648a3-2f02-49d9-97e5-028a321d5a4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:16 np0005531888 nova_compute[186788]: 2025-11-22 07:51:16.750 186792 DEBUG oslo_concurrency.lockutils [req-7c52cf62-0e32-4b85-8dfa-89399d4b67b9 req-eed648a3-2f02-49d9-97e5-028a321d5a4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:16 np0005531888 nova_compute[186788]: 2025-11-22 07:51:16.750 186792 DEBUG oslo_concurrency.lockutils [req-7c52cf62-0e32-4b85-8dfa-89399d4b67b9 req-eed648a3-2f02-49d9-97e5-028a321d5a4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:16 np0005531888 nova_compute[186788]: 2025-11-22 07:51:16.750 186792 DEBUG oslo_concurrency.lockutils [req-7c52cf62-0e32-4b85-8dfa-89399d4b67b9 req-eed648a3-2f02-49d9-97e5-028a321d5a4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:16 np0005531888 nova_compute[186788]: 2025-11-22 07:51:16.751 186792 DEBUG nova.compute.manager [req-7c52cf62-0e32-4b85-8dfa-89399d4b67b9 req-eed648a3-2f02-49d9-97e5-028a321d5a4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] No waiting events found dispatching network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:16 np0005531888 nova_compute[186788]: 2025-11-22 07:51:16.751 186792 WARNING nova.compute.manager [req-7c52cf62-0e32-4b85-8dfa-89399d4b67b9 req-eed648a3-2f02-49d9-97e5-028a321d5a4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received unexpected event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:51:17 np0005531888 nova_compute[186788]: 2025-11-22 07:51:17.671 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:17 np0005531888 podman[220601]: 2025-11-22 07:51:17.699217075 +0000 UTC m=+0.065808164 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:51:17 np0005531888 nova_compute[186788]: 2025-11-22 07:51:17.918 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:18 np0005531888 nova_compute[186788]: 2025-11-22 07:51:18.738 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:18 np0005531888 nova_compute[186788]: 2025-11-22 07:51:18.739 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:18 np0005531888 nova_compute[186788]: 2025-11-22 07:51:18.739 186792 INFO nova.compute.manager [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Shelving#033[00m
Nov 22 02:51:18 np0005531888 nova_compute[186788]: 2025-11-22 07:51:18.769 186792 DEBUG nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:51:19 np0005531888 podman[220621]: 2025-11-22 07:51:19.693679091 +0000 UTC m=+0.065593809 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:51:22 np0005531888 nova_compute[186788]: 2025-11-22 07:51:22.673 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:22 np0005531888 nova_compute[186788]: 2025-11-22 07:51:22.920 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:23 np0005531888 podman[220646]: 2025-11-22 07:51:23.685414562 +0000 UTC m=+0.063355175 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Nov 22 02:51:27 np0005531888 nova_compute[186788]: 2025-11-22 07:51:27.675 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:27 np0005531888 nova_compute[186788]: 2025-11-22 07:51:27.922 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.097 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.098 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.098 186792 INFO nova.compute.manager [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Unshelving#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.243 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.243 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.249 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'pci_requests' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.268 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'numa_topology' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.287 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.288 186792 INFO nova.compute.claims [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.508 186792 DEBUG nova.compute.provider_tree [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.529 186792 DEBUG nova.scheduler.client.report [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.555 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.716 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.717 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquired lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.717 186792 DEBUG nova.network.neutron [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.819 186792 DEBUG nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:51:28 np0005531888 nova_compute[186788]: 2025-11-22 07:51:28.948 186792 DEBUG nova.network.neutron [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.689 186792 DEBUG nova.network.neutron [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.706 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Releasing lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.707 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.708 186792 INFO nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating image(s)#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.708 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.709 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.709 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.710 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.718 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "91b50f83eaa261e984af5000107bf50c6f917c53" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:29 np0005531888 nova_compute[186788]: 2025-11-22 07:51:29.719 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "91b50f83eaa261e984af5000107bf50c6f917c53" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:30Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:53:4b 10.100.0.5
Nov 22 02:51:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:30Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:53:4b 10.100.0.5
Nov 22 02:51:31 np0005531888 podman[220680]: 2025-11-22 07:51:31.699599572 +0000 UTC m=+0.070827934 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:51:31 np0005531888 podman[220681]: 2025-11-22 07:51:31.71987555 +0000 UTC m=+0.087189088 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.184 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.257 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.part --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.258 186792 DEBUG nova.virt.images [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] 24d66fd8-ff66-4cac-ab5a-96ed855aa97a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.261 186792 DEBUG nova.privsep.utils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.261 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.part /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.674 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.part /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.converted" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.687 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.755 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53.converted --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.756 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "91b50f83eaa261e984af5000107bf50c6f917c53" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.771 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.833 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.835 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "91b50f83eaa261e984af5000107bf50c6f917c53" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.835 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "91b50f83eaa261e984af5000107bf50c6f917c53" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.847 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.921 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.922 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53,backing_fmt=raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.960 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53,backing_fmt=raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.961 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "91b50f83eaa261e984af5000107bf50c6f917c53" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:32 np0005531888 nova_compute[186788]: 2025-11-22 07:51:32.962 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:33 np0005531888 nova_compute[186788]: 2025-11-22 07:51:33.023 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:33 np0005531888 nova_compute[186788]: 2025-11-22 07:51:33.025 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'migration_context' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:33 np0005531888 nova_compute[186788]: 2025-11-22 07:51:33.036 186792 INFO nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Rebasing disk image.#033[00m
Nov 22 02:51:33 np0005531888 nova_compute[186788]: 2025-11-22 07:51:33.037 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:33 np0005531888 nova_compute[186788]: 2025-11-22 07:51:33.096 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:33 np0005531888 nova_compute[186788]: 2025-11-22 07:51:33.098 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.594 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk" returned: 0 in 1.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.596 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.596 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Ensure instance console log exists: /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.597 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.598 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.598 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.599 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='4d479f032bc674e06fc090516610379b',container_format='bare',created_at=2025-11-22T07:51:05Z,direct_url=<?>,disk_format='qcow2',id=24d66fd8-ff66-4cac-ab5a-96ed855aa97a,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1801097299-shelved',owner='d31cb5bd32934c45b774fafa62a8eb01',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-22T07:51:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.604 186792 WARNING nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.608 186792 DEBUG nova.virt.libvirt.host [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.609 186792 DEBUG nova.virt.libvirt.host [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.613 186792 DEBUG nova.virt.libvirt.host [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.614 186792 DEBUG nova.virt.libvirt.host [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.616 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.616 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='4d479f032bc674e06fc090516610379b',container_format='bare',created_at=2025-11-22T07:51:05Z,direct_url=<?>,disk_format='qcow2',id=24d66fd8-ff66-4cac-ab5a-96ed855aa97a,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1801097299-shelved',owner='d31cb5bd32934c45b774fafa62a8eb01',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-22T07:51:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.617 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.617 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.617 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.617 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.618 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.618 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.618 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.619 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.619 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.619 186792 DEBUG nova.virt.hardware [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.619 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.638 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'pci_devices' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.660 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <uuid>fccb17bd-32d9-4428-8812-0fbb6f93afa4</uuid>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <name>instance-00000030</name>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1801097299</nova:name>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:51:34</nova:creationTime>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:user uuid="a2e51707e7f64c0793f0a8feeb6c40e6">tempest-UnshelveToHostMultiNodesTest-1261470077-project-member</nova:user>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:        <nova:project uuid="d31cb5bd32934c45b774fafa62a8eb01">tempest-UnshelveToHostMultiNodesTest-1261470077</nova:project>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="24d66fd8-ff66-4cac-ab5a-96ed855aa97a"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <entry name="serial">fccb17bd-32d9-4428-8812-0fbb6f93afa4</entry>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <entry name="uuid">fccb17bd-32d9-4428-8812-0fbb6f93afa4</entry>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/console.log" append="off"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:51:34 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:51:34 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:51:34 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:51:34 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.730 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.731 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.732 186792 INFO nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Using config drive
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.747 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:51:34 np0005531888 nova_compute[186788]: 2025-11-22 07:51:34.822 186792 DEBUG nova.objects.instance [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'keypairs' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.045 186792 INFO nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating config drive at /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.051 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp97fcst7w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.181 186792 DEBUG oslo_concurrency.processutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp97fcst7w" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:51:35 np0005531888 systemd-machined[153106]: New machine qemu-26-instance-00000030.
Nov 22 02:51:35 np0005531888 systemd[1]: Started Virtual Machine qemu-26-instance-00000030.
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.706 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797895.7056527, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.707 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Resumed (Lifecycle Event)
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.710 186792 DEBUG nova.compute.manager [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.711 186792 DEBUG nova.virt.libvirt.driver [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.715 186792 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance spawned successfully.
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.838 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.844 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.875 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.876 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797895.7084596, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.876 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Started (Lifecycle Event)
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.909 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.915 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 02:51:35 np0005531888 nova_compute[186788]: 2025-11-22 07:51:35.947 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 02:51:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:36.804 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:51:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:36.805 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:51:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:36.806 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:51:36 np0005531888 nova_compute[186788]: 2025-11-22 07:51:36.906 186792 DEBUG nova.compute.manager [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:51:36 np0005531888 nova_compute[186788]: 2025-11-22 07:51:36.994 186792 DEBUG oslo_concurrency.lockutils [None req-5bf6dfea-52f8-4679-b573-c83438fd2259 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 8.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:51:37 np0005531888 nova_compute[186788]: 2025-11-22 07:51:37.680 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:51:37 np0005531888 podman[220789]: 2025-11-22 07:51:37.754015119 +0000 UTC m=+0.101729127 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:51:37 np0005531888 nova_compute[186788]: 2025-11-22 07:51:37.943 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.298 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.299 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.299 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.299 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.300 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.307 186792 INFO nova.compute.manager [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Terminating instance
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.313 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.314 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquired lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.314 186792 DEBUG nova.network.neutron [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.507 186792 DEBUG nova.network.neutron [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.809 186792 DEBUG nova.network.neutron [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.825 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Releasing lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:38 np0005531888 nova_compute[186788]: 2025-11-22 07:51:38.827 186792 DEBUG nova.compute.manager [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:51:38 np0005531888 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000030.scope: Deactivated successfully.
Nov 22 02:51:38 np0005531888 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000030.scope: Consumed 3.610s CPU time.
Nov 22 02:51:38 np0005531888 systemd-machined[153106]: Machine qemu-26-instance-00000030 terminated.
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.092 186792 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.093 186792 DEBUG nova.objects.instance [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'resources' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.123 186792 INFO nova.virt.libvirt.driver [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deleting instance files /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4_del#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.130 186792 INFO nova.virt.libvirt.driver [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deletion of /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4_del complete#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.266 186792 INFO nova.compute.manager [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.267 186792 DEBUG oslo.service.loopingcall [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.267 186792 DEBUG nova.compute.manager [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.267 186792 DEBUG nova.network.neutron [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.820 186792 DEBUG nova.network.neutron [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.859 186792 DEBUG nova.network.neutron [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.884 186792 INFO nova.compute.manager [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Took 0.62 seconds to deallocate network for instance.#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.905 186792 DEBUG nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.991 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.993 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:51:39 np0005531888 nova_compute[186788]: 2025-11-22 07:51:39.993 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 74d7bd73-125a-4539-8622-d0225143bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.043 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.044 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.150 186792 DEBUG nova.compute.provider_tree [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.176 186792 DEBUG nova.scheduler.client.report [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.236 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.273 186792 INFO nova.scheduler.client.report [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Deleted allocations for instance fccb17bd-32d9-4428-8812-0fbb6f93afa4#033[00m
Nov 22 02:51:40 np0005531888 nova_compute[186788]: 2025-11-22 07:51:40.363 186792 DEBUG oslo_concurrency.lockutils [None req-eb20eb5b-321c-40a4-8b4b-f851ec84b746 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:41 np0005531888 podman[220825]: 2025-11-22 07:51:41.70829639 +0000 UTC m=+0.074029411 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.120 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updating instance_info_cache with network_info: [{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.148 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.148 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.148 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:42 np0005531888 kernel: tapf3c93f0d-0d (unregistering): left promiscuous mode
Nov 22 02:51:42 np0005531888 NetworkManager[55166]: <info>  [1763797902.1655] device (tapf3c93f0d-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:51:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:42Z|00133|binding|INFO|Releasing lport f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 from this chassis (sb_readonly=0)
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.171 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:42Z|00134|binding|INFO|Setting lport f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 down in Southbound
Nov 22 02:51:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:51:42Z|00135|binding|INFO|Removing iface tapf3c93f0d-0d ovn-installed in OVS
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.174 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.180 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:53:4b 10.100.0.5'], port_security=['fa:16:3e:3a:53:4b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '74d7bd73-125a-4539-8622-d0225143bd08', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.181 104023 INFO neutron.agent.ovn.metadata.agent [-] Port f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.183 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.185 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e925cbc2-8853-44b2-920d-4632cdff0d93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.186 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.196 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Deactivated successfully.
Nov 22 02:51:42 np0005531888 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Consumed 15.719s CPU time.
Nov 22 02:51:42 np0005531888 systemd-machined[153106]: Machine qemu-25-instance-00000034 terminated.
Nov 22 02:51:42 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [NOTICE]   (220583) : haproxy version is 2.8.14-c23fe91
Nov 22 02:51:42 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [NOTICE]   (220583) : path to executable is /usr/sbin/haproxy
Nov 22 02:51:42 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [WARNING]  (220583) : Exiting Master process...
Nov 22 02:51:42 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [WARNING]  (220583) : Exiting Master process...
Nov 22 02:51:42 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [ALERT]    (220583) : Current worker (220585) exited with code 143 (Terminated)
Nov 22 02:51:42 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220579]: [WARNING]  (220583) : All workers exited. Exiting... (0)
Nov 22 02:51:42 np0005531888 systemd[1]: libpod-50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744.scope: Deactivated successfully.
Nov 22 02:51:42 np0005531888 podman[220869]: 2025-11-22 07:51:42.363044907 +0000 UTC m=+0.056039389 container died 50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 02:51:42 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744-userdata-shm.mount: Deactivated successfully.
Nov 22 02:51:42 np0005531888 systemd[1]: var-lib-containers-storage-overlay-948612ec9c9cec77f1a92fed73ae8ff7ae66c9004518eaf41ce474ed75bd2b2c-merged.mount: Deactivated successfully.
Nov 22 02:51:42 np0005531888 podman[220869]: 2025-11-22 07:51:42.427619199 +0000 UTC m=+0.120613681 container cleanup 50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:51:42 np0005531888 systemd[1]: libpod-conmon-50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744.scope: Deactivated successfully.
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.451 186792 DEBUG nova.compute.manager [req-10147cd6-43b7-429b-a3e8-d858045c84e7 req-4fb6bc59-af49-49a5-bd24-9f1557838b76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received event network-vif-unplugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.452 186792 DEBUG oslo_concurrency.lockutils [req-10147cd6-43b7-429b-a3e8-d858045c84e7 req-4fb6bc59-af49-49a5-bd24-9f1557838b76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.452 186792 DEBUG oslo_concurrency.lockutils [req-10147cd6-43b7-429b-a3e8-d858045c84e7 req-4fb6bc59-af49-49a5-bd24-9f1557838b76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.452 186792 DEBUG oslo_concurrency.lockutils [req-10147cd6-43b7-429b-a3e8-d858045c84e7 req-4fb6bc59-af49-49a5-bd24-9f1557838b76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.452 186792 DEBUG nova.compute.manager [req-10147cd6-43b7-429b-a3e8-d858045c84e7 req-4fb6bc59-af49-49a5-bd24-9f1557838b76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] No waiting events found dispatching network-vif-unplugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.453 186792 WARNING nova.compute.manager [req-10147cd6-43b7-429b-a3e8-d858045c84e7 req-4fb6bc59-af49-49a5-bd24-9f1557838b76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received unexpected event network-vif-unplugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 for instance with vm_state active and task_state shelving.#033[00m
Nov 22 02:51:42 np0005531888 podman[220912]: 2025-11-22 07:51:42.507152602 +0000 UTC m=+0.054637505 container remove 50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.514 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be609afd-9b2f-4095-bc01-e7d9b3a595af]: (4, ('Sat Nov 22 07:51:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744)\n50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744\nSat Nov 22 07:51:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744)\n50e25a83203b15a5935d53220d0deccd86408dffc6a5abfb603fa39f69cce744\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.517 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cb64b8a0-f319-4194-b299-e5c2909536eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.518 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.521 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.538 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.540 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f1a42d-bec7-4463-8863-f1b9302ee26e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.562 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[021c594e-a86a-42a0-a49e-117f42b211a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.564 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f97dc5-08f3-4fce-a037-6d1e200b3e45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.581 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ee64fa34-e780-4e80-881f-5c8160c88406]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461473, 'reachable_time': 22553, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220932, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.584 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:51:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:51:42.584 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d2e65b-d84d-4fea-a02a-ff1d4f5801f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:42 np0005531888 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.683 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.923 186792 INFO nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.931 186792 INFO nova.virt.libvirt.driver [-] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance destroyed successfully.#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.932 186792 DEBUG nova.objects.instance [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'numa_topology' on Instance uuid 74d7bd73-125a-4539-8622-d0225143bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.946 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:42 np0005531888 nova_compute[186788]: 2025-11-22 07:51:42.958 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:43 np0005531888 nova_compute[186788]: 2025-11-22 07:51:43.542 186792 INFO nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Beginning cold snapshot process#033[00m
Nov 22 02:51:43 np0005531888 nova_compute[186788]: 2025-11-22 07:51:43.761 186792 DEBUG nova.privsep.utils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:51:43 np0005531888 nova_compute[186788]: 2025-11-22 07:51:43.761 186792 DEBUG oslo_concurrency.processutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk /var/lib/nova/instances/snapshots/tmpw53g5etp/abcd92807c9241dabafb37f3261d669f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:43 np0005531888 nova_compute[186788]: 2025-11-22 07:51:43.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:44 np0005531888 nova_compute[186788]: 2025-11-22 07:51:44.431 186792 DEBUG oslo_concurrency.processutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk /var/lib/nova/instances/snapshots/tmpw53g5etp/abcd92807c9241dabafb37f3261d669f" returned: 0 in 0.670s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:44 np0005531888 nova_compute[186788]: 2025-11-22 07:51:44.432 186792 INFO nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:51:44 np0005531888 nova_compute[186788]: 2025-11-22 07:51:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:45 np0005531888 nova_compute[186788]: 2025-11-22 07:51:45.001 186792 DEBUG nova.compute.manager [req-f990cfa5-d621-46a2-8325-5f8e34292bb7 req-15d3c988-5863-41ca-bbd2-989e6fce2def 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:45 np0005531888 nova_compute[186788]: 2025-11-22 07:51:45.001 186792 DEBUG oslo_concurrency.lockutils [req-f990cfa5-d621-46a2-8325-5f8e34292bb7 req-15d3c988-5863-41ca-bbd2-989e6fce2def 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "74d7bd73-125a-4539-8622-d0225143bd08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:45 np0005531888 nova_compute[186788]: 2025-11-22 07:51:45.002 186792 DEBUG oslo_concurrency.lockutils [req-f990cfa5-d621-46a2-8325-5f8e34292bb7 req-15d3c988-5863-41ca-bbd2-989e6fce2def 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:45 np0005531888 nova_compute[186788]: 2025-11-22 07:51:45.003 186792 DEBUG oslo_concurrency.lockutils [req-f990cfa5-d621-46a2-8325-5f8e34292bb7 req-15d3c988-5863-41ca-bbd2-989e6fce2def 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:45 np0005531888 nova_compute[186788]: 2025-11-22 07:51:45.003 186792 DEBUG nova.compute.manager [req-f990cfa5-d621-46a2-8325-5f8e34292bb7 req-15d3c988-5863-41ca-bbd2-989e6fce2def 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] No waiting events found dispatching network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:45 np0005531888 nova_compute[186788]: 2025-11-22 07:51:45.004 186792 WARNING nova.compute.manager [req-f990cfa5-d621-46a2-8325-5f8e34292bb7 req-15d3c988-5863-41ca-bbd2-989e6fce2def 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received unexpected event network-vif-plugged-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Nov 22 02:51:47 np0005531888 nova_compute[186788]: 2025-11-22 07:51:47.685 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:47 np0005531888 nova_compute[186788]: 2025-11-22 07:51:47.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:47 np0005531888 nova_compute[186788]: 2025-11-22 07:51:47.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:47 np0005531888 nova_compute[186788]: 2025-11-22 07:51:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.475 186792 INFO nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Snapshot image upload complete#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.476 186792 DEBUG nova.compute.manager [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.573 186792 INFO nova.compute.manager [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Shelve offloading#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.590 186792 INFO nova.virt.libvirt.driver [-] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance destroyed successfully.#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.591 186792 DEBUG nova.compute.manager [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.594 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.595 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:48 np0005531888 nova_compute[186788]: 2025-11-22 07:51:48.595 186792 DEBUG nova.network.neutron [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:51:48 np0005531888 podman[220944]: 2025-11-22 07:51:48.705733486 +0000 UTC m=+0.070510366 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:51:50 np0005531888 podman[220964]: 2025-11-22 07:51:50.715743587 +0000 UTC m=+0.087563387 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:51:50 np0005531888 nova_compute[186788]: 2025-11-22 07:51:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:50 np0005531888 nova_compute[186788]: 2025-11-22 07:51:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:50 np0005531888 nova_compute[186788]: 2025-11-22 07:51:50.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:50 np0005531888 nova_compute[186788]: 2025-11-22 07:51:50.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:50 np0005531888 nova_compute[186788]: 2025-11-22 07:51:50.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:50 np0005531888 nova_compute[186788]: 2025-11-22 07:51:50.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.051 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.106 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.107 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.170 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.305 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.307 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5775MB free_disk=73.32231521606445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.307 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.307 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.369 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 74d7bd73-125a-4539-8622-d0225143bd08 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.369 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.370 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.427 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.443 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.490 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.491 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.874 186792 DEBUG nova.network.neutron [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updating instance_info_cache with network_info: [{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:51 np0005531888 nova_compute[186788]: 2025-11-22 07:51:51.924 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:52 np0005531888 nova_compute[186788]: 2025-11-22 07:51:52.485 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:52 np0005531888 nova_compute[186788]: 2025-11-22 07:51:52.687 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:52 np0005531888 nova_compute[186788]: 2025-11-22 07:51:52.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.657 186792 INFO nova.virt.libvirt.driver [-] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Instance destroyed successfully.#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.658 186792 DEBUG nova.objects.instance [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid 74d7bd73-125a-4539-8622-d0225143bd08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.670 186792 DEBUG nova.virt.libvirt.vif [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-958897149',display_name='tempest-DeleteServersTestJSON-server-958897149',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-958897149',id=52,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:51:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-kqgg1fvg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member',shelved_at='2025-11-22T07:51:48.476099',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='44c7b849-0696-4492-89a2-c62f73f450a7'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:51:44Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=74d7bd73-125a-4539-8622-d0225143bd08,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.671 186792 DEBUG nova.network.os_vif_util [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.672 186792 DEBUG nova.network.os_vif_util [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.672 186792 DEBUG os_vif [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.674 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.675 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3c93f0d-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.678 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.679 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.684 186792 INFO os_vif [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:53:4b,bridge_name='br-int',has_traffic_filtering=True,id=f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c93f0d-0d')#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.685 186792 INFO nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Deleting instance files /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08_del#033[00m
Nov 22 02:51:53 np0005531888 nova_compute[186788]: 2025-11-22 07:51:53.690 186792 INFO nova.virt.libvirt.driver [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Deletion of /var/lib/nova/instances/74d7bd73-125a-4539-8622-d0225143bd08_del complete#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.090 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797899.0891325, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.090 186792 INFO nova.compute.manager [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.302 186792 DEBUG nova.compute.manager [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Received event network-changed-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.302 186792 DEBUG nova.compute.manager [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Refreshing instance network info cache due to event network-changed-f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.303 186792 DEBUG oslo_concurrency.lockutils [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.303 186792 DEBUG oslo_concurrency.lockutils [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:54 np0005531888 nova_compute[186788]: 2025-11-22 07:51:54.303 186792 DEBUG nova.network.neutron [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Refreshing network info cache for port f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:51:54 np0005531888 podman[220994]: 2025-11-22 07:51:54.726613118 +0000 UTC m=+0.090279502 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:51:55 np0005531888 nova_compute[186788]: 2025-11-22 07:51:55.905 186792 DEBUG nova.compute.manager [None req-cada9136-5510-4ead-906f-eb6c6b2d91c4 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.642 186792 INFO nova.scheduler.client.report [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocations for instance 74d7bd73-125a-4539-8622-d0225143bd08#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.741 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.743 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.773 186792 DEBUG nova.compute.provider_tree [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.792 186792 DEBUG nova.scheduler.client.report [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.830 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:56 np0005531888 nova_compute[186788]: 2025-11-22 07:51:56.969 186792 DEBUG oslo_concurrency.lockutils [None req-8afa64ee-be63-4fc2-a4b9-05d189b8c4ba 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "74d7bd73-125a-4539-8622-d0225143bd08" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 38.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:57 np0005531888 nova_compute[186788]: 2025-11-22 07:51:57.458 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797902.4566753, 74d7bd73-125a-4539-8622-d0225143bd08 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:57 np0005531888 nova_compute[186788]: 2025-11-22 07:51:57.459 186792 INFO nova.compute.manager [-] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:51:57 np0005531888 nova_compute[186788]: 2025-11-22 07:51:57.480 186792 DEBUG nova.compute.manager [None req-df25f270-83a8-455e-a182-0495645a8368 - - - - - -] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:57 np0005531888 nova_compute[186788]: 2025-11-22 07:51:57.689 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:58 np0005531888 nova_compute[186788]: 2025-11-22 07:51:58.444 186792 DEBUG nova.network.neutron [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updated VIF entry in instance network info cache for port f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:51:58 np0005531888 nova_compute[186788]: 2025-11-22 07:51:58.445 186792 DEBUG nova.network.neutron [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 74d7bd73-125a-4539-8622-d0225143bd08] Updating instance_info_cache with network_info: [{"id": "f3c93f0d-0d4e-4f46-9d6f-5585d8bb3027", "address": "fa:16:3e:3a:53:4b", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": null, "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapf3c93f0d-0d", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:58 np0005531888 nova_compute[186788]: 2025-11-22 07:51:58.520 186792 DEBUG oslo_concurrency.lockutils [req-7dfd31f1-1eac-48e6-af30-79d3260a490a req-b526ca13-2c2d-4404-b115-d2b8c50b98a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-74d7bd73-125a-4539-8622-d0225143bd08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:58 np0005531888 nova_compute[186788]: 2025-11-22 07:51:58.680 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:00.675 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:52:00 np0005531888 nova_compute[186788]: 2025-11-22 07:52:00.676 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:00.677 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:52:02 np0005531888 nova_compute[186788]: 2025-11-22 07:52:02.695 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:02 np0005531888 podman[221014]: 2025-11-22 07:52:02.70045809 +0000 UTC m=+0.061645884 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:52:02 np0005531888 podman[221015]: 2025-11-22 07:52:02.726743042 +0000 UTC m=+0.086944622 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 02:52:03 np0005531888 nova_compute[186788]: 2025-11-22 07:52:03.684 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:07.679 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:07 np0005531888 nova_compute[186788]: 2025-11-22 07:52:07.693 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:08 np0005531888 nova_compute[186788]: 2025-11-22 07:52:08.687 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:08 np0005531888 podman[221060]: 2025-11-22 07:52:08.720416478 +0000 UTC m=+0.082312843 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:52:12 np0005531888 podman[221086]: 2025-11-22 07:52:12.681445459 +0000 UTC m=+0.057933080 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:52:12 np0005531888 nova_compute[186788]: 2025-11-22 07:52:12.696 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:13 np0005531888 nova_compute[186788]: 2025-11-22 07:52:13.689 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:17 np0005531888 nova_compute[186788]: 2025-11-22 07:52:17.698 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:18 np0005531888 nova_compute[186788]: 2025-11-22 07:52:18.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:19 np0005531888 podman[221106]: 2025-11-22 07:52:19.695951902 +0000 UTC m=+0.061823016 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:52:21 np0005531888 podman[221126]: 2025-11-22 07:52:21.682701336 +0000 UTC m=+0.054344810 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:52:22 np0005531888 nova_compute[186788]: 2025-11-22 07:52:22.700 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:23 np0005531888 nova_compute[186788]: 2025-11-22 07:52:23.694 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:25 np0005531888 podman[221147]: 2025-11-22 07:52:25.68971003 +0000 UTC m=+0.061304975 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 02:52:27 np0005531888 nova_compute[186788]: 2025-11-22 07:52:27.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:28 np0005531888 nova_compute[186788]: 2025-11-22 07:52:28.696 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:32 np0005531888 nova_compute[186788]: 2025-11-22 07:52:32.703 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:33 np0005531888 podman[221170]: 2025-11-22 07:52:33.684621042 +0000 UTC m=+0.058074263 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 02:52:33 np0005531888 nova_compute[186788]: 2025-11-22 07:52:33.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:33 np0005531888 podman[221171]: 2025-11-22 07:52:33.721935722 +0000 UTC m=+0.092380157 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:52:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:36.805 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:36.805 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:36.805 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.838 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.839 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:52:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:52:37 np0005531888 nova_compute[186788]: 2025-11-22 07:52:37.705 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:38 np0005531888 nova_compute[186788]: 2025-11-22 07:52:38.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:39 np0005531888 podman[221215]: 2025-11-22 07:52:39.685550041 +0000 UTC m=+0.057264465 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:52:40 np0005531888 nova_compute[186788]: 2025-11-22 07:52:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:41.175 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:52:41 np0005531888 nova_compute[186788]: 2025-11-22 07:52:41.176 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:41.178 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:52:41 np0005531888 nova_compute[186788]: 2025-11-22 07:52:41.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:41 np0005531888 nova_compute[186788]: 2025-11-22 07:52:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:41 np0005531888 nova_compute[186788]: 2025-11-22 07:52:41.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:52:41 np0005531888 nova_compute[186788]: 2025-11-22 07:52:41.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:52:42 np0005531888 nova_compute[186788]: 2025-11-22 07:52:42.011 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:52:42 np0005531888 nova_compute[186788]: 2025-11-22 07:52:42.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:43 np0005531888 podman[221239]: 2025-11-22 07:52:43.689069067 +0000 UTC m=+0.057973821 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:52:43 np0005531888 nova_compute[186788]: 2025-11-22 07:52:43.704 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:43 np0005531888 nova_compute[186788]: 2025-11-22 07:52:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:43 np0005531888 nova_compute[186788]: 2025-11-22 07:52:43.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:52:44.180 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:44 np0005531888 nova_compute[186788]: 2025-11-22 07:52:44.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:47 np0005531888 nova_compute[186788]: 2025-11-22 07:52:47.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:48 np0005531888 nova_compute[186788]: 2025-11-22 07:52:48.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:49 np0005531888 nova_compute[186788]: 2025-11-22 07:52:49.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:49 np0005531888 nova_compute[186788]: 2025-11-22 07:52:49.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:52:50 np0005531888 podman[221259]: 2025-11-22 07:52:50.683555966 +0000 UTC m=+0.057773448 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:52:50 np0005531888 nova_compute[186788]: 2025-11-22 07:52:50.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:52 np0005531888 podman[221280]: 2025-11-22 07:52:52.699711805 +0000 UTC m=+0.072028705 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:52:52 np0005531888 nova_compute[186788]: 2025-11-22 07:52:52.711 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:52 np0005531888 nova_compute[186788]: 2025-11-22 07:52:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.004 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.004 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.005 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.005 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.216 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.218 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5756MB free_disk=73.35090637207031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.218 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.218 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.385 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.386 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.439 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.465 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.524 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.525 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:53 np0005531888 nova_compute[186788]: 2025-11-22 07:52:53.710 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.173 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.173 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.287 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.544 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.545 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.553 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.554 186792 INFO nova.compute.claims [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.740 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:54 np0005531888 nova_compute[186788]: 2025-11-22 07:52:54.741 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.141 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.244 186792 DEBUG nova.compute.provider_tree [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.258 186792 DEBUG nova.scheduler.client.report [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.616 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.617 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:52:56 np0005531888 podman[221304]: 2025-11-22 07:52:56.686551 +0000 UTC m=+0.060837875 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.688 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.689 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.695 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.696 186792 INFO nova.compute.claims [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.830 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.830 186792 DEBUG nova.network.neutron [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:52:56 np0005531888 nova_compute[186788]: 2025-11-22 07:52:56.976 186792 INFO nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.035 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.113 186792 DEBUG nova.compute.provider_tree [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.137 186792 DEBUG nova.scheduler.client.report [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.242 186792 DEBUG nova.policy [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.255 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.256 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.298 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.300 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.300 186792 INFO nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Creating image(s)#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.301 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "/var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.301 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.302 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.318 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.374 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.375 186792 DEBUG nova.network.neutron [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.391 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.391 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.392 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.403 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.424 186792 INFO nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.463 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.464 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.486 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.519 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.520 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.521 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.577 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.578 186792 DEBUG nova.virt.disk.api [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Checking if we can resize image /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.578 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.635 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.636 186792 DEBUG nova.virt.disk.api [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Cannot resize image /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.636 186792 DEBUG nova.objects.instance [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'migration_context' on Instance uuid e8d09e62-e93e-4700-bec1-fb9aff8683eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.672 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.673 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Ensure instance console log exists: /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.673 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.674 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.674 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.711 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.713 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.714 186792 INFO nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Creating image(s)#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.714 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.715 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.715 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.731 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.797 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.798 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.799 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.810 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.872 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.874 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.916 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.917 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.918 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.971 186792 DEBUG nova.policy [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.980 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.981 186792 DEBUG nova.virt.disk.api [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:52:57 np0005531888 nova_compute[186788]: 2025-11-22 07:52:57.982 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.046 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.047 186792 DEBUG nova.virt.disk.api [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.047 186792 DEBUG nova.objects.instance [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid 0ca89548-45e7-4c83-bd8b-4447c7898213 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.071 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.072 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Ensure instance console log exists: /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.072 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.073 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.073 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:58 np0005531888 nova_compute[186788]: 2025-11-22 07:52:58.712 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:59 np0005531888 nova_compute[186788]: 2025-11-22 07:52:59.001 186792 DEBUG nova.network.neutron [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Successfully created port: fe4fc124-3378-4a5c-a252-9c8c23b3164b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:52:59 np0005531888 nova_compute[186788]: 2025-11-22 07:52:59.607 186792 DEBUG nova.network.neutron [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Successfully created port: be123147-427c-4357-b495-4d7a782eeb33 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.599 186792 DEBUG nova.network.neutron [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Successfully updated port: fe4fc124-3378-4a5c-a252-9c8c23b3164b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.644 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.645 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquired lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.645 186792 DEBUG nova.network.neutron [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.809 186792 DEBUG nova.compute.manager [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received event network-changed-fe4fc124-3378-4a5c-a252-9c8c23b3164b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.810 186792 DEBUG nova.compute.manager [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Refreshing instance network info cache due to event network-changed-fe4fc124-3378-4a5c-a252-9c8c23b3164b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:53:01 np0005531888 nova_compute[186788]: 2025-11-22 07:53:01.810 186792 DEBUG oslo_concurrency.lockutils [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.027 186792 DEBUG nova.network.neutron [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.172 186792 DEBUG nova.network.neutron [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Successfully updated port: be123147-427c-4357-b495-4d7a782eeb33 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.191 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-0ca89548-45e7-4c83-bd8b-4447c7898213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.191 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-0ca89548-45e7-4c83-bd8b-4447c7898213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.191 186792 DEBUG nova.network.neutron [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.684 186792 DEBUG nova.network.neutron [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:53:02 np0005531888 nova_compute[186788]: 2025-11-22 07:53:02.716 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:03 np0005531888 nova_compute[186788]: 2025-11-22 07:53:03.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:04 np0005531888 nova_compute[186788]: 2025-11-22 07:53:04.015 186792 DEBUG nova.compute.manager [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received event network-changed-be123147-427c-4357-b495-4d7a782eeb33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:04 np0005531888 nova_compute[186788]: 2025-11-22 07:53:04.015 186792 DEBUG nova.compute.manager [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Refreshing instance network info cache due to event network-changed-be123147-427c-4357-b495-4d7a782eeb33. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:53:04 np0005531888 nova_compute[186788]: 2025-11-22 07:53:04.016 186792 DEBUG oslo_concurrency.lockutils [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-0ca89548-45e7-4c83-bd8b-4447c7898213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:53:04 np0005531888 podman[221355]: 2025-11-22 07:53:04.692304694 +0000 UTC m=+0.062075482 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118)
Nov 22 02:53:04 np0005531888 podman[221356]: 2025-11-22 07:53:04.72494609 +0000 UTC m=+0.087587820 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:53:07 np0005531888 nova_compute[186788]: 2025-11-22 07:53:07.716 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:08 np0005531888 nova_compute[186788]: 2025-11-22 07:53:08.719 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.305 186792 DEBUG nova.network.neutron [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Updating instance_info_cache with network_info: [{"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.403 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Releasing lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.404 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Instance network_info: |[{"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.404 186792 DEBUG oslo_concurrency.lockutils [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.405 186792 DEBUG nova.network.neutron [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Refreshing network info cache for port fe4fc124-3378-4a5c-a252-9c8c23b3164b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.408 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Start _get_guest_xml network_info=[{"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.413 186792 WARNING nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.422 186792 DEBUG nova.virt.libvirt.host [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.423 186792 DEBUG nova.virt.libvirt.host [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.429 186792 DEBUG nova.virt.libvirt.host [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.430 186792 DEBUG nova.virt.libvirt.host [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.431 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.432 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.432 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.432 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.433 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.433 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.433 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.433 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.434 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.434 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.434 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.435 186792 DEBUG nova.virt.hardware [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.439 186792 DEBUG nova.virt.libvirt.vif [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-470238106',display_name='tempest-ImagesTestJSON-server-470238106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-470238106',id=57,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-nlmpfle3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:57Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=e8d09e62-e93e-4700-bec1-fb9aff8683eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.440 186792 DEBUG nova.network.os_vif_util [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.441 186792 DEBUG nova.network.os_vif_util [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.443 186792 DEBUG nova.objects.instance [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid e8d09e62-e93e-4700-bec1-fb9aff8683eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.476 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <uuid>e8d09e62-e93e-4700-bec1-fb9aff8683eb</uuid>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <name>instance-00000039</name>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:name>tempest-ImagesTestJSON-server-470238106</nova:name>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:53:09</nova:creationTime>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:user uuid="1ac2d2381d294c96aff369941185056a">tempest-ImagesTestJSON-117614339-project-member</nova:user>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:project uuid="7ec4007dc8214caab4e2eb40f11fb3cd">tempest-ImagesTestJSON-117614339</nova:project>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        <nova:port uuid="fe4fc124-3378-4a5c-a252-9c8c23b3164b">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <entry name="serial">e8d09e62-e93e-4700-bec1-fb9aff8683eb</entry>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <entry name="uuid">e8d09e62-e93e-4700-bec1-fb9aff8683eb</entry>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.config"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:a8:14:49"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <target dev="tapfe4fc124-33"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/console.log" append="off"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:53:09 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:53:09 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:53:09 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:53:09 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.477 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Preparing to wait for external event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.478 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.478 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.478 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.479 186792 DEBUG nova.virt.libvirt.vif [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-470238106',display_name='tempest-ImagesTestJSON-server-470238106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-470238106',id=57,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-nlmpfle3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'}
,tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:57Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=e8d09e62-e93e-4700-bec1-fb9aff8683eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.479 186792 DEBUG nova.network.os_vif_util [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.480 186792 DEBUG nova.network.os_vif_util [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.480 186792 DEBUG os_vif [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.481 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.481 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.482 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.484 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.484 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe4fc124-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.485 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe4fc124-33, col_values=(('external_ids', {'iface-id': 'fe4fc124-3378-4a5c-a252-9c8c23b3164b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:14:49', 'vm-uuid': 'e8d09e62-e93e-4700-bec1-fb9aff8683eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.487 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:09 np0005531888 NetworkManager[55166]: <info>  [1763797989.4884] manager: (tapfe4fc124-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.491 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.495 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.496 186792 INFO os_vif [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33')#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.585 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.586 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.586 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No VIF found with MAC fa:16:3e:a8:14:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:53:09 np0005531888 nova_compute[186788]: 2025-11-22 07:53:09.587 186792 INFO nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Using config drive#033[00m
Nov 22 02:53:10 np0005531888 podman[221404]: 2025-11-22 07:53:10.675705871 +0000 UTC m=+0.050341331 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.097 186792 DEBUG nova.network.neutron [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Updating instance_info_cache with network_info: [{"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.342 186792 INFO nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Creating config drive at /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.config#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.347 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl2afog_n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.486 186792 DEBUG oslo_concurrency.processutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl2afog_n" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:11 np0005531888 kernel: tapfe4fc124-33: entered promiscuous mode
Nov 22 02:53:11 np0005531888 NetworkManager[55166]: <info>  [1763797991.5659] manager: (tapfe4fc124-33): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 22 02:53:11 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:11Z|00136|binding|INFO|Claiming lport fe4fc124-3378-4a5c-a252-9c8c23b3164b for this chassis.
Nov 22 02:53:11 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:11Z|00137|binding|INFO|fe4fc124-3378-4a5c-a252-9c8c23b3164b: Claiming fa:16:3e:a8:14:49 10.100.0.11
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.567 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.594 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-0ca89548-45e7-4c83-bd8b-4447c7898213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.596 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Instance network_info: |[{"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.596 186792 DEBUG oslo_concurrency.lockutils [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-0ca89548-45e7-4c83-bd8b-4447c7898213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.597 186792 DEBUG nova.network.neutron [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Refreshing network info cache for port be123147-427c-4357-b495-4d7a782eeb33 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:53:11 np0005531888 systemd-udevd[221445]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.600 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Start _get_guest_xml network_info=[{"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:53:11 np0005531888 systemd-machined[153106]: New machine qemu-27-instance-00000039.
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.608 186792 WARNING nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:53:11 np0005531888 NetworkManager[55166]: <info>  [1763797991.6146] device (tapfe4fc124-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:53:11 np0005531888 NetworkManager[55166]: <info>  [1763797991.6177] device (tapfe4fc124-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.617 186792 DEBUG nova.virt.libvirt.host [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.619 186792 DEBUG nova.virt.libvirt.host [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.627 186792 DEBUG nova.virt.libvirt.host [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.629 186792 DEBUG nova.virt.libvirt.host [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:53:11 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:11Z|00138|binding|INFO|Setting lport fe4fc124-3378-4a5c-a252-9c8c23b3164b ovn-installed in OVS
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.630 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.631 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.631 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.631 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.632 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.632 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.632 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.632 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.632 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.632 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.633 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.633 186792 DEBUG nova.virt.hardware [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:53:11 np0005531888 systemd[1]: Started Virtual Machine qemu-27-instance-00000039.
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.637 186792 DEBUG nova.virt.libvirt.vif [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1536496656',display_name='tempest-DeleteServersTestJSON-server-1536496656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1536496656',id=58,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-odo2xzim',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSO
N-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:57Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=0ca89548-45e7-4c83-bd8b-4447c7898213,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.638 186792 DEBUG nova.network.os_vif_util [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.638 186792 DEBUG nova.network.os_vif_util [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.639 186792 DEBUG nova.objects.instance [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0ca89548-45e7-4c83-bd8b-4447c7898213 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.640 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.676 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <uuid>0ca89548-45e7-4c83-bd8b-4447c7898213</uuid>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <name>instance-0000003a</name>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:name>tempest-DeleteServersTestJSON-server-1536496656</nova:name>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:53:11</nova:creationTime>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        <nova:port uuid="be123147-427c-4357-b495-4d7a782eeb33">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <entry name="serial">0ca89548-45e7-4c83-bd8b-4447c7898213</entry>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <entry name="uuid">0ca89548-45e7-4c83-bd8b-4447c7898213</entry>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.config"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:5d:74:09"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <target dev="tapbe123147-42"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/console.log" append="off"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:53:11 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:53:11 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:53:11 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:53:11 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.676 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Preparing to wait for external event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.676 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.677 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.677 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.678 186792 DEBUG nova.virt.libvirt.vif [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1536496656',display_name='tempest-DeleteServersTestJSON-server-1536496656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1536496656',id=58,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-odo2xzim',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServ
ersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:57Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=0ca89548-45e7-4c83-bd8b-4447c7898213,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.678 186792 DEBUG nova.network.os_vif_util [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.678 186792 DEBUG nova.network.os_vif_util [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.679 186792 DEBUG os_vif [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.679 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.680 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.680 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.682 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.683 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe123147-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.683 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe123147-42, col_values=(('external_ids', {'iface-id': 'be123147-427c-4357-b495-4d7a782eeb33', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:74:09', 'vm-uuid': '0ca89548-45e7-4c83-bd8b-4447c7898213'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.685 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 NetworkManager[55166]: <info>  [1763797991.6863] manager: (tapbe123147-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.687 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.691 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.692 186792 INFO os_vif [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42')#033[00m
Nov 22 02:53:11 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:11Z|00139|binding|INFO|Setting lport fe4fc124-3378-4a5c-a252-9c8c23b3164b up in Southbound
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.717 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:14:49 10.100.0.11'], port_security=['fa:16:3e:a8:14:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e8d09e62-e93e-4700-bec1-fb9aff8683eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=fe4fc124-3378-4a5c-a252-9c8c23b3164b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.718 104023 INFO neutron.agent.ovn.metadata.agent [-] Port fe4fc124-3378-4a5c-a252-9c8c23b3164b in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a bound to our chassis#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.720 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.733 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2985941e-7453-4f79-b1d2-a9c082d39d31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.734 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc6b9ee8-e1 in ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.736 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc6b9ee8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.736 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fb704b86-93ad-437a-8b4e-e65aa63c6eaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.737 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[828aca58-d7cb-4be7-9355-d466cb393fb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.751 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e8258090-e364-4998-8b0e-06ed43778c60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.775 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8fee1146-40b0-4038-9098-95ecd4d487c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.798 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.799 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.799 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:5d:74:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:53:11 np0005531888 nova_compute[186788]: 2025-11-22 07:53:11.799 186792 INFO nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Using config drive#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.810 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c3426481-75f8-4af7-96a4-ab070318a93f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.815 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f27aa589-4073-42a0-a194-643bde031096]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 NetworkManager[55166]: <info>  [1763797991.8167] manager: (tapdc6b9ee8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Nov 22 02:53:11 np0005531888 systemd-udevd[221448]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.852 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4328b5be-ac67-4171-be82-2063abb4e286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.856 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf57b8a-6bb8-4ee8-8649-f0a86decbd4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 NetworkManager[55166]: <info>  [1763797991.8864] device (tapdc6b9ee8-e0): carrier: link connected
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.894 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2286ca5b-bda3-46ba-9ae1-95561ab03190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.921 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dd805c8d-3021-44fd-928e-50fe798b609d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473252, 'reachable_time': 24321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221484, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.940 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[53835066-cfbe-48a9-a78b-c72d3ea5ed7c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d89c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473252, 'tstamp': 473252}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221492, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.958 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3512224b-2920-432e-91e1-0a136835f3ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473252, 'reachable_time': 24321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221493, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:11.992 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfbaec6-8712-40db-9c3b-90e299ac1eb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.023 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797992.0224397, e8d09e62-e93e-4700-bec1-fb9aff8683eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.023 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] VM Started (Lifecycle Event)#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.055 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4aefaa43-81f2-41d6-a2c1-ecef53c46dfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.057 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.057 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.057 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc6b9ee8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:12 np0005531888 NetworkManager[55166]: <info>  [1763797992.0598] manager: (tapdc6b9ee8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:12 np0005531888 kernel: tapdc6b9ee8-e0: entered promiscuous mode
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.064 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.065 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc6b9ee8-e0, col_values=(('external_ids', {'iface-id': '99cae854-daa9-4d08-8152-257a15e21bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:12 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:12Z|00140|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.069 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.070 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.075 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797992.0250678, e8d09e62-e93e-4700-bec1-fb9aff8683eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.076 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.079 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.083 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.084 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aca08070-bc8b-4f24-b296-f39c526194ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.085 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:12.085 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'env', 'PROCESS_TAG=haproxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.098 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.101 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.147 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:53:12 np0005531888 podman[221529]: 2025-11-22 07:53:12.486774476 +0000 UTC m=+0.025498238 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.603 186792 DEBUG nova.compute.manager [req-c84e7fca-6769-4abd-aab7-ae3bc74d068b req-b388b0c9-7a7a-48fd-abff-8f81bde4f5b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.604 186792 DEBUG oslo_concurrency.lockutils [req-c84e7fca-6769-4abd-aab7-ae3bc74d068b req-b388b0c9-7a7a-48fd-abff-8f81bde4f5b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.604 186792 DEBUG oslo_concurrency.lockutils [req-c84e7fca-6769-4abd-aab7-ae3bc74d068b req-b388b0c9-7a7a-48fd-abff-8f81bde4f5b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.604 186792 DEBUG oslo_concurrency.lockutils [req-c84e7fca-6769-4abd-aab7-ae3bc74d068b req-b388b0c9-7a7a-48fd-abff-8f81bde4f5b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.605 186792 DEBUG nova.compute.manager [req-c84e7fca-6769-4abd-aab7-ae3bc74d068b req-b388b0c9-7a7a-48fd-abff-8f81bde4f5b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Processing event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.605 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.609 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797992.6089087, e8d09e62-e93e-4700-bec1-fb9aff8683eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.609 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.611 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.615 186792 INFO nova.virt.libvirt.driver [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Instance spawned successfully.#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.616 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.629 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.634 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.637 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.637 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.638 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.638 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.639 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.639 186792 DEBUG nova.virt.libvirt.driver [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:12 np0005531888 podman[221529]: 2025-11-22 07:53:12.652848131 +0000 UTC m=+0.191571873 container create f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.672 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.718 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.772 186792 INFO nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Took 15.47 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:53:12 np0005531888 nova_compute[186788]: 2025-11-22 07:53:12.773 186792 DEBUG nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:12 np0005531888 systemd[1]: Started libpod-conmon-f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc.scope.
Nov 22 02:53:12 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:53:12 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f45ceaeebb9eff22d79d27eda44481830fe3cc077dcb9354c0badb3d8089585d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:53:12 np0005531888 podman[221529]: 2025-11-22 07:53:12.94801398 +0000 UTC m=+0.486737752 container init f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:53:12 np0005531888 podman[221529]: 2025-11-22 07:53:12.953995534 +0000 UTC m=+0.492719276 container start f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:53:12 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [NOTICE]   (221549) : New worker (221551) forked
Nov 22 02:53:12 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [NOTICE]   (221549) : Loading success.
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.006 186792 INFO nova.compute.manager [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Took 18.49 seconds to build instance.#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.062 186792 INFO nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Creating config drive at /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.config#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.067 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsje82rl5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.120 186792 DEBUG oslo_concurrency.lockutils [None req-9de6ce2d-837a-4043-accf-b830d57a9e05 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.946s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.203 186792 DEBUG oslo_concurrency.processutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsje82rl5" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:13 np0005531888 kernel: tapbe123147-42: entered promiscuous mode
Nov 22 02:53:13 np0005531888 NetworkManager[55166]: <info>  [1763797993.2778] manager: (tapbe123147-42): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 22 02:53:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:13Z|00141|binding|INFO|Claiming lport be123147-427c-4357-b495-4d7a782eeb33 for this chassis.
Nov 22 02:53:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:13Z|00142|binding|INFO|be123147-427c-4357-b495-4d7a782eeb33: Claiming fa:16:3e:5d:74:09 10.100.0.12
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.279 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:13 np0005531888 systemd-udevd[221471]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:53:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:13Z|00143|binding|INFO|Setting lport be123147-427c-4357-b495-4d7a782eeb33 ovn-installed in OVS
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.292 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.294 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:13 np0005531888 NetworkManager[55166]: <info>  [1763797993.3004] device (tapbe123147-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:53:13 np0005531888 NetworkManager[55166]: <info>  [1763797993.3019] device (tapbe123147-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:53:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:13Z|00144|binding|INFO|Setting lport be123147-427c-4357-b495-4d7a782eeb33 up in Southbound
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.309 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:74:09 10.100.0.12'], port_security=['fa:16:3e:5d:74:09 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0ca89548-45e7-4c83-bd8b-4447c7898213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=be123147-427c-4357-b495-4d7a782eeb33) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.311 104023 INFO neutron.agent.ovn.metadata.agent [-] Port be123147-427c-4357-b495-4d7a782eeb33 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.313 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c#033[00m
Nov 22 02:53:13 np0005531888 systemd-machined[153106]: New machine qemu-28-instance-0000003a.
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.328 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[efb0724e-5b48-4cfa-aa2e-bcd2b7b78008]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.329 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.331 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.331 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f645c2a8-df12-4837-a8c2-ce1b24652fb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.333 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a5a516-c858-4fcc-a029-0db0c3b41f76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 systemd[1]: Started Virtual Machine qemu-28-instance-0000003a.
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.346 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[05940d4d-7534-4a73-b287-25ee7faa0669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.377 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[df03718b-c307-4ac4-ba20-901b56fa2041]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.411 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9d4e4b-b585-494b-abdb-bf7db314dec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 NetworkManager[55166]: <info>  [1763797993.4194] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.418 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf64bee-373c-49f4-a265-36f2919889cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.452 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[47d690f3-9110-40da-bf83-73f9193cbf43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.455 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6128042f-13c4-4fc9-9d44-09b52dd0d913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 NetworkManager[55166]: <info>  [1763797993.4768] device (tap5e910dbb-20): carrier: link connected
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.481 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[33e46a97-a134-4ca2-ac35-f1b0ff160e73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.498 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[16e2fee1-7294-4b2e-8d98-0a28c3e51a97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473412, 'reachable_time': 21333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221596, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.516 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7967eb-89a5-4b9e-8231-7f8481619878]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473412, 'tstamp': 473412}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221597, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.536 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5dac353-b2e6-4f7f-bc54-2ae18dbbddc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473412, 'reachable_time': 21333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221598, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.566 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[df3e1ba1-bf44-416c-a0b4-a2a1c8647b44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.625 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a8283a7a-def7-4f53-9678-76cdbaff8b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.628 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.628 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.629 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:13 np0005531888 NetworkManager[55166]: <info>  [1763797993.6328] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 22 02:53:13 np0005531888 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.631 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.635 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:13Z|00145|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.651 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.653 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.655 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7b01e13c-a2ec-4d9c-8ab4-d3016897920d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.656 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:53:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:13.657 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.827 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797993.8260465, 0ca89548-45e7-4c83-bd8b-4447c7898213 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.828 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] VM Started (Lifecycle Event)#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.868 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.873 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797993.8289702, 0ca89548-45e7-4c83-bd8b-4447c7898213 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.873 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.904 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.909 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:13 np0005531888 nova_compute[186788]: 2025-11-22 07:53:13.943 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:53:14 np0005531888 nova_compute[186788]: 2025-11-22 07:53:14.065 186792 DEBUG nova.network.neutron [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Updated VIF entry in instance network info cache for port fe4fc124-3378-4a5c-a252-9c8c23b3164b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:53:14 np0005531888 nova_compute[186788]: 2025-11-22 07:53:14.066 186792 DEBUG nova.network.neutron [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Updating instance_info_cache with network_info: [{"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:14 np0005531888 podman[221635]: 2025-11-22 07:53:14.025664243 +0000 UTC m=+0.032798291 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:53:14 np0005531888 nova_compute[186788]: 2025-11-22 07:53:14.125 186792 DEBUG oslo_concurrency.lockutils [req-1451ab68-6b19-43df-92a3-e026e041cb6d req-23d6936c-3c26-4f25-b1c4-1c5ff120fb4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:53:14 np0005531888 podman[221635]: 2025-11-22 07:53:14.130388744 +0000 UTC m=+0.137522762 container create 8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 02:53:14 np0005531888 systemd[1]: Started libpod-conmon-8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748.scope.
Nov 22 02:53:14 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:53:14 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d55e743e6e7c59ad925ec27aa4e450213f1d53d2b5d938a2d0aa93b10dc9e216/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:53:14 np0005531888 podman[221635]: 2025-11-22 07:53:14.24792111 +0000 UTC m=+0.255055148 container init 8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 02:53:14 np0005531888 podman[221647]: 2025-11-22 07:53:14.24887232 +0000 UTC m=+0.079594512 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:53:14 np0005531888 podman[221635]: 2025-11-22 07:53:14.256267625 +0000 UTC m=+0.263401643 container start 8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:53:14 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [NOTICE]   (221672) : New worker (221674) forked
Nov 22 02:53:14 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [NOTICE]   (221672) : Loading success.
Nov 22 02:53:15 np0005531888 nova_compute[186788]: 2025-11-22 07:53:15.108 186792 DEBUG nova.compute.manager [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:15 np0005531888 nova_compute[186788]: 2025-11-22 07:53:15.108 186792 DEBUG oslo_concurrency.lockutils [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:15 np0005531888 nova_compute[186788]: 2025-11-22 07:53:15.109 186792 DEBUG oslo_concurrency.lockutils [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:15 np0005531888 nova_compute[186788]: 2025-11-22 07:53:15.109 186792 DEBUG oslo_concurrency.lockutils [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:15 np0005531888 nova_compute[186788]: 2025-11-22 07:53:15.109 186792 DEBUG nova.compute.manager [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] No waiting events found dispatching network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:53:15 np0005531888 nova_compute[186788]: 2025-11-22 07:53:15.110 186792 WARNING nova.compute.manager [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received unexpected event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b for instance with vm_state active and task_state None.#033[00m
Nov 22 02:53:16 np0005531888 nova_compute[186788]: 2025-11-22 07:53:16.007 186792 DEBUG nova.network.neutron [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Updated VIF entry in instance network info cache for port be123147-427c-4357-b495-4d7a782eeb33. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:53:16 np0005531888 nova_compute[186788]: 2025-11-22 07:53:16.007 186792 DEBUG nova.network.neutron [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Updating instance_info_cache with network_info: [{"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:16 np0005531888 nova_compute[186788]: 2025-11-22 07:53:16.032 186792 DEBUG oslo_concurrency.lockutils [req-3de0e88e-45ac-4f04-a455-d0d38ff17f86 req-e267902d-8a13-4fc4-af34-2722767c5c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-0ca89548-45e7-4c83-bd8b-4447c7898213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:53:16 np0005531888 nova_compute[186788]: 2025-11-22 07:53:16.685 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.272 186792 DEBUG nova.compute.manager [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.274 186792 DEBUG oslo_concurrency.lockutils [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.274 186792 DEBUG oslo_concurrency.lockutils [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.274 186792 DEBUG oslo_concurrency.lockutils [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.274 186792 DEBUG nova.compute.manager [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Processing event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.275 186792 DEBUG nova.compute.manager [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.275 186792 DEBUG oslo_concurrency.lockutils [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.275 186792 DEBUG oslo_concurrency.lockutils [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.275 186792 DEBUG oslo_concurrency.lockutils [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.275 186792 DEBUG nova.compute.manager [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] No waiting events found dispatching network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.276 186792 WARNING nova.compute.manager [req-087c404e-ba58-4872-aa99-3ecb30766dfa req-c7c221d9-8d89-49d5-b38d-710a4d53bb3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received unexpected event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.276 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.282 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797997.282331, 0ca89548-45e7-4c83-bd8b-4447c7898213 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.282 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.285 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.291 186792 INFO nova.virt.libvirt.driver [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Instance spawned successfully.#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.291 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.310 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.319 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.327 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.328 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.329 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.329 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.330 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.330 186792 DEBUG nova.virt.libvirt.driver [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.382 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.508 186792 INFO nova.compute.manager [None req-b3c9bc4f-a85e-4db7-aa24-1ed24896552e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Pausing#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.509 186792 DEBUG nova.objects.instance [None req-b3c9bc4f-a85e-4db7-aa24-1ed24896552e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'flavor' on Instance uuid e8d09e62-e93e-4700-bec1-fb9aff8683eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.564 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763797997.5645182, e8d09e62-e93e-4700-bec1-fb9aff8683eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.565 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.568 186792 DEBUG nova.compute.manager [None req-b3c9bc4f-a85e-4db7-aa24-1ed24896552e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.615 186792 INFO nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Took 19.90 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.616 186792 DEBUG nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.649 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.653 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.684 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.719 186792 INFO nova.compute.manager [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Took 21.07 seconds to build instance.#033[00m
Nov 22 02:53:17 np0005531888 nova_compute[186788]: 2025-11-22 07:53:17.720 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:18 np0005531888 nova_compute[186788]: 2025-11-22 07:53:18.080 186792 DEBUG oslo_concurrency.lockutils [None req-ce227763-019d-40cc-b9db-90329f10bf8c 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:20 np0005531888 nova_compute[186788]: 2025-11-22 07:53:20.051 186792 DEBUG nova.objects.instance [None req-70884aad-db32-4f13-b7fa-8aab612c12bc 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0ca89548-45e7-4c83-bd8b-4447c7898213 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:20 np0005531888 nova_compute[186788]: 2025-11-22 07:53:20.320 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798000.319747, 0ca89548-45e7-4c83-bd8b-4447c7898213 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:20 np0005531888 nova_compute[186788]: 2025-11-22 07:53:20.320 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:53:20 np0005531888 nova_compute[186788]: 2025-11-22 07:53:20.386 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:20 np0005531888 nova_compute[186788]: 2025-11-22 07:53:20.390 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:20 np0005531888 nova_compute[186788]: 2025-11-22 07:53:20.430 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 22 02:53:20 np0005531888 kernel: tapbe123147-42 (unregistering): left promiscuous mode
Nov 22 02:53:20 np0005531888 NetworkManager[55166]: <info>  [1763798000.9925] device (tapbe123147-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.007 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:21Z|00146|binding|INFO|Releasing lport be123147-427c-4357-b495-4d7a782eeb33 from this chassis (sb_readonly=0)
Nov 22 02:53:21 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:21Z|00147|binding|INFO|Setting lport be123147-427c-4357-b495-4d7a782eeb33 down in Southbound
Nov 22 02:53:21 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:21Z|00148|binding|INFO|Removing iface tapbe123147-42 ovn-installed in OVS
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.020 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Nov 22 02:53:21 np0005531888 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003a.scope: Consumed 3.546s CPU time.
Nov 22 02:53:21 np0005531888 systemd-machined[153106]: Machine qemu-28-instance-0000003a terminated.
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.108 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:74:09 10.100.0.12'], port_security=['fa:16:3e:5d:74:09 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0ca89548-45e7-4c83-bd8b-4447c7898213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=be123147-427c-4357-b495-4d7a782eeb33) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.110 104023 INFO neutron.agent.ovn.metadata.agent [-] Port be123147-427c-4357-b495-4d7a782eeb33 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.111 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.112 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[235d1172-ab7a-4f7b-b4c5-f0b2c7dff3d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.113 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore#033[00m
Nov 22 02:53:21 np0005531888 podman[221686]: 2025-11-22 07:53:21.136677266 +0000 UTC m=+0.110970652 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.250 186792 DEBUG nova.compute.manager [None req-70884aad-db32-4f13-b7fa-8aab612c12bc 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:21 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [NOTICE]   (221672) : haproxy version is 2.8.14-c23fe91
Nov 22 02:53:21 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [NOTICE]   (221672) : path to executable is /usr/sbin/haproxy
Nov 22 02:53:21 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [WARNING]  (221672) : Exiting Master process...
Nov 22 02:53:21 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [WARNING]  (221672) : Exiting Master process...
Nov 22 02:53:21 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [ALERT]    (221672) : Current worker (221674) exited with code 143 (Terminated)
Nov 22 02:53:21 np0005531888 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221658]: [WARNING]  (221672) : All workers exited. Exiting... (0)
Nov 22 02:53:21 np0005531888 systemd[1]: libpod-8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748.scope: Deactivated successfully.
Nov 22 02:53:21 np0005531888 podman[221731]: 2025-11-22 07:53:21.27891048 +0000 UTC m=+0.064913555 container died 8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:53:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748-userdata-shm.mount: Deactivated successfully.
Nov 22 02:53:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay-d55e743e6e7c59ad925ec27aa4e450213f1d53d2b5d938a2d0aa93b10dc9e216-merged.mount: Deactivated successfully.
Nov 22 02:53:21 np0005531888 podman[221731]: 2025-11-22 07:53:21.369159489 +0000 UTC m=+0.155162584 container cleanup 8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:53:21 np0005531888 systemd[1]: libpod-conmon-8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748.scope: Deactivated successfully.
Nov 22 02:53:21 np0005531888 podman[221773]: 2025-11-22 07:53:21.454236673 +0000 UTC m=+0.059457195 container remove 8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.461 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[202300ea-6ed0-4bd0-9f27-4da0836bd97c]: (4, ('Sat Nov 22 07:53:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748)\n8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748\nSat Nov 22 07:53:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748)\n8e974b1da45f936c367e6540efc9ed6a9bea06c40a08e78411ad231b26aa4748\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.464 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9210e6ba-018c-48fd-bc2c-595f860e8f84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.465 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.468 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.488 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.490 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.493 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e2534068-2030-4bc6-af9d-0839087b49fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.515 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[10839cf3-09f3-433e-afcb-1b636e83a923]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.517 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a11b9e8e-899d-47ee-87fe-fe762ff46e68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.534 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec6e7c0-31d2-4324-87f6-1b73b5be90ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473405, 'reachable_time': 20549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221792, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.538 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:53:21 np0005531888 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 02:53:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:21.539 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9b2191-11e6-4455-a061-214b227bad9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:21 np0005531888 nova_compute[186788]: 2025-11-22 07:53:21.688 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.187 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:22.187 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:53:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:22.189 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.469 186792 DEBUG nova.compute.manager [req-7d60a036-ee9b-4bc5-9aba-48c85dc70c78 req-ae4a69cb-2c56-4cac-ba54-c2c4dfa5668f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received event network-vif-unplugged-be123147-427c-4357-b495-4d7a782eeb33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.469 186792 DEBUG oslo_concurrency.lockutils [req-7d60a036-ee9b-4bc5-9aba-48c85dc70c78 req-ae4a69cb-2c56-4cac-ba54-c2c4dfa5668f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.469 186792 DEBUG oslo_concurrency.lockutils [req-7d60a036-ee9b-4bc5-9aba-48c85dc70c78 req-ae4a69cb-2c56-4cac-ba54-c2c4dfa5668f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.470 186792 DEBUG oslo_concurrency.lockutils [req-7d60a036-ee9b-4bc5-9aba-48c85dc70c78 req-ae4a69cb-2c56-4cac-ba54-c2c4dfa5668f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.470 186792 DEBUG nova.compute.manager [req-7d60a036-ee9b-4bc5-9aba-48c85dc70c78 req-ae4a69cb-2c56-4cac-ba54-c2c4dfa5668f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] No waiting events found dispatching network-vif-unplugged-be123147-427c-4357-b495-4d7a782eeb33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.470 186792 WARNING nova.compute.manager [req-7d60a036-ee9b-4bc5-9aba-48c85dc70c78 req-ae4a69cb-2c56-4cac-ba54-c2c4dfa5668f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received unexpected event network-vif-unplugged-be123147-427c-4357-b495-4d7a782eeb33 for instance with vm_state suspended and task_state None.#033[00m
Nov 22 02:53:22 np0005531888 nova_compute[186788]: 2025-11-22 07:53:22.723 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:23 np0005531888 podman[221793]: 2025-11-22 07:53:23.718291387 +0000 UTC m=+0.072981873 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:53:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:25.193 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:26 np0005531888 nova_compute[186788]: 2025-11-22 07:53:26.690 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:27 np0005531888 podman[221817]: 2025-11-22 07:53:27.705603518 +0000 UTC m=+0.076586951 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 02:53:27 np0005531888 nova_compute[186788]: 2025-11-22 07:53:27.725 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:31 np0005531888 nova_compute[186788]: 2025-11-22 07:53:31.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:32 np0005531888 nova_compute[186788]: 2025-11-22 07:53:32.728 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:35 np0005531888 podman[221840]: 2025-11-22 07:53:35.693251411 +0000 UTC m=+0.063151551 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 02:53:35 np0005531888 podman[221841]: 2025-11-22 07:53:35.739536047 +0000 UTC m=+0.103147872 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.251 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798001.2494633, 0ca89548-45e7-4c83-bd8b-4447c7898213 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.252 186792 INFO nova.compute.manager [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.271 186792 DEBUG nova.compute.manager [None req-90319e55-5d51-4c62-8107-e3e9c20df025 - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.275 186792 DEBUG nova.compute.manager [None req-90319e55-5d51-4c62-8107-e3e9c20df025 - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: suspended, current task_state: deleting, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.310 186792 INFO nova.compute.manager [None req-90319e55-5d51-4c62-8107-e3e9c20df025 - - - - - -] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.364 186792 DEBUG nova.compute.manager [req-d84ad9aa-8c93-47ba-8454-4c123a4e4e50 req-ad539408-dfcf-4fe5-9925-c4b8c0aeee38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.365 186792 DEBUG oslo_concurrency.lockutils [req-d84ad9aa-8c93-47ba-8454-4c123a4e4e50 req-ad539408-dfcf-4fe5-9925-c4b8c0aeee38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.366 186792 DEBUG oslo_concurrency.lockutils [req-d84ad9aa-8c93-47ba-8454-4c123a4e4e50 req-ad539408-dfcf-4fe5-9925-c4b8c0aeee38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.366 186792 DEBUG oslo_concurrency.lockutils [req-d84ad9aa-8c93-47ba-8454-4c123a4e4e50 req-ad539408-dfcf-4fe5-9925-c4b8c0aeee38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.366 186792 DEBUG nova.compute.manager [req-d84ad9aa-8c93-47ba-8454-4c123a4e4e50 req-ad539408-dfcf-4fe5-9925-c4b8c0aeee38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] No waiting events found dispatching network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.366 186792 WARNING nova.compute.manager [req-d84ad9aa-8c93-47ba-8454-4c123a4e4e50 req-ad539408-dfcf-4fe5-9925-c4b8c0aeee38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received unexpected event network-vif-plugged-be123147-427c-4357-b495-4d7a782eeb33 for instance with vm_state suspended and task_state deleting.#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.378 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.379 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.379 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.379 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.380 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.389 186792 INFO nova.compute.manager [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Terminating instance#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.396 186792 DEBUG nova.compute.manager [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.402 186792 INFO nova.virt.libvirt.driver [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Instance destroyed successfully.#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.403 186792 DEBUG nova.objects.instance [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid 0ca89548-45e7-4c83-bd8b-4447c7898213 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.488 186792 DEBUG nova.virt.libvirt.vif [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:52:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1536496656',display_name='tempest-DeleteServersTestJSON-server-1536496656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1536496656',id=58,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:53:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-odo2xzim',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:53:21Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=0ca89548-45e7-4c83-bd8b-4447c7898213,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.489 186792 DEBUG nova.network.os_vif_util [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "be123147-427c-4357-b495-4d7a782eeb33", "address": "fa:16:3e:5d:74:09", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe123147-42", "ovs_interfaceid": "be123147-427c-4357-b495-4d7a782eeb33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.490 186792 DEBUG nova.network.os_vif_util [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.491 186792 DEBUG os_vif [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.493 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.493 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe123147-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.496 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.498 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.504 186792 INFO os_vif [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:74:09,bridge_name='br-int',has_traffic_filtering=True,id=be123147-427c-4357-b495-4d7a782eeb33,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe123147-42')#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.505 186792 INFO nova.virt.libvirt.driver [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Deleting instance files /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213_del#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.507 186792 INFO nova.virt.libvirt.driver [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Deletion of /var/lib/nova/instances/0ca89548-45e7-4c83-bd8b-4447c7898213_del complete#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.733 186792 DEBUG nova.compute.manager [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.798 186792 INFO nova.compute.manager [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] instance snapshotting#033[00m
Nov 22 02:53:36 np0005531888 nova_compute[186788]: 2025-11-22 07:53:36.798 186792 WARNING nova.compute.manager [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Nov 22 02:53:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:36.806 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:36.806 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:36.807 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:37 np0005531888 nova_compute[186788]: 2025-11-22 07:53:37.242 186792 INFO nova.virt.libvirt.driver [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Beginning live snapshot process#033[00m
Nov 22 02:53:37 np0005531888 nova_compute[186788]: 2025-11-22 07:53:37.568 186792 INFO nova.compute.manager [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Took 1.17 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:53:37 np0005531888 nova_compute[186788]: 2025-11-22 07:53:37.569 186792 DEBUG oslo.service.loopingcall [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:53:37 np0005531888 nova_compute[186788]: 2025-11-22 07:53:37.569 186792 DEBUG nova.compute.manager [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:53:37 np0005531888 nova_compute[186788]: 2025-11-22 07:53:37.570 186792 DEBUG nova.network.neutron [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:53:37 np0005531888 nova_compute[186788]: 2025-11-22 07:53:37.730 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:38 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.730 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.795 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.797 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.862 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.874 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.931 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:38 np0005531888 nova_compute[186788]: 2025-11-22 07:53:38.932 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp1_8owew6/27cc4fe98f2a4199bd2ebdcd347f6632.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.111 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp1_8owew6/27cc4fe98f2a4199bd2ebdcd347f6632.delta 1073741824" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.112 186792 INFO nova.virt.libvirt.driver [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.183 186792 DEBUG nova.virt.libvirt.guest [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.687 186792 DEBUG nova.virt.libvirt.guest [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.692 186792 INFO nova.virt.libvirt.driver [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.781 186792 DEBUG nova.privsep.utils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:53:39 np0005531888 nova_compute[186788]: 2025-11-22 07:53:39.782 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp1_8owew6/27cc4fe98f2a4199bd2ebdcd347f6632.delta /var/lib/nova/instances/snapshots/tmp1_8owew6/27cc4fe98f2a4199bd2ebdcd347f6632 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:40 np0005531888 nova_compute[186788]: 2025-11-22 07:53:40.670 186792 DEBUG oslo_concurrency.processutils [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp1_8owew6/27cc4fe98f2a4199bd2ebdcd347f6632.delta /var/lib/nova/instances/snapshots/tmp1_8owew6/27cc4fe98f2a4199bd2ebdcd347f6632" returned: 0 in 0.888s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:40 np0005531888 nova_compute[186788]: 2025-11-22 07:53:40.672 186792 INFO nova.virt.libvirt.driver [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:53:40 np0005531888 nova_compute[186788]: 2025-11-22 07:53:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:40 np0005531888 nova_compute[186788]: 2025-11-22 07:53:40.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.497 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:41 np0005531888 podman[221911]: 2025-11-22 07:53:41.686935506 +0000 UTC m=+0.053663838 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.777 186792 DEBUG nova.network.neutron [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.824 186792 INFO nova.compute.manager [-] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Took 4.25 seconds to deallocate network for instance.#033[00m
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.977 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.978 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.978 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:53:41 np0005531888 nova_compute[186788]: 2025-11-22 07:53:41.998 186792 DEBUG nova.compute.manager [req-cc4bc911-14ca-4f3c-bb33-b03f1633f2f2 req-fc340452-141b-4558-8e87-d2cf79d654b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0ca89548-45e7-4c83-bd8b-4447c7898213] Received event network-vif-deleted-be123147-427c-4357-b495-4d7a782eeb33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.005 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.006 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.012 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.012 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.012 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.013 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e8d09e62-e93e-4700-bec1-fb9aff8683eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.175 186792 DEBUG nova.compute.provider_tree [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.190 186792 DEBUG nova.scheduler.client.report [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.227 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.458 186792 INFO nova.scheduler.client.report [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocations for instance 0ca89548-45e7-4c83-bd8b-4447c7898213#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.724 186792 DEBUG oslo_concurrency.lockutils [None req-30985552-65d5-4de7-bb4d-f23a2e8bf755 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "0ca89548-45e7-4c83-bd8b-4447c7898213" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:42 np0005531888 nova_compute[186788]: 2025-11-22 07:53:42.732 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:44 np0005531888 podman[221936]: 2025-11-22 07:53:44.68038243 +0000 UTC m=+0.054604072 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.790 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Updating instance_info_cache with network_info: [{"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.886 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-e8d09e62-e93e-4700-bec1-fb9aff8683eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.887 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.887 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.887 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.888 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531888 nova_compute[186788]: 2025-11-22 07:53:45.888 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:46 np0005531888 nova_compute[186788]: 2025-11-22 07:53:46.498 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:46 np0005531888 nova_compute[186788]: 2025-11-22 07:53:46.859 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:47 np0005531888 nova_compute[186788]: 2025-11-22 07:53:47.042 186792 INFO nova.virt.libvirt.driver [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Snapshot image upload complete#033[00m
Nov 22 02:53:47 np0005531888 nova_compute[186788]: 2025-11-22 07:53:47.043 186792 INFO nova.compute.manager [None req-b2733732-1b09-46a0-833c-b75c96478f1e 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Took 10.24 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:53:47 np0005531888 nova_compute[186788]: 2025-11-22 07:53:47.735 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:47 np0005531888 nova_compute[186788]: 2025-11-22 07:53:47.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:49 np0005531888 nova_compute[186788]: 2025-11-22 07:53:49.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:49 np0005531888 nova_compute[186788]: 2025-11-22 07:53:49.973 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:53:49 np0005531888 nova_compute[186788]: 2025-11-22 07:53:49.990 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:53:50 np0005531888 nova_compute[186788]: 2025-11-22 07:53:50.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.500 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:51 np0005531888 podman[221955]: 2025-11-22 07:53:51.715725375 +0000 UTC m=+0.086015903 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.978 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.979 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.981 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.981 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.982 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.982 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.983 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:51 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.992 186792 INFO nova.compute.manager [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Terminating instance#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:51.999 186792 DEBUG nova.compute.manager [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:53:52 np0005531888 kernel: tapfe4fc124-33 (unregistering): left promiscuous mode
Nov 22 02:53:52 np0005531888 NetworkManager[55166]: <info>  [1763798032.0238] device (tapfe4fc124-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.027 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:52Z|00149|binding|INFO|Releasing lport fe4fc124-3378-4a5c-a252-9c8c23b3164b from this chassis (sb_readonly=0)
Nov 22 02:53:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:52Z|00150|binding|INFO|Setting lport fe4fc124-3378-4a5c-a252-9c8c23b3164b down in Southbound
Nov 22 02:53:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:53:52Z|00151|binding|INFO|Removing iface tapfe4fc124-33 ovn-installed in OVS
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.045 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:52.057 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:14:49 10.100.0.11'], port_security=['fa:16:3e:a8:14:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e8d09e62-e93e-4700-bec1-fb9aff8683eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=fe4fc124-3378-4a5c-a252-9c8c23b3164b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:53:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:52.058 104023 INFO neutron.agent.ovn.metadata.agent [-] Port fe4fc124-3378-4a5c-a252-9c8c23b3164b in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a unbound from our chassis#033[00m
Nov 22 02:53:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:52.060 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:53:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:52.091 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[77f2b023-a201-42ce-baf1-a225e54bfef3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:52.092 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace which is not needed anymore#033[00m
Nov 22 02:53:52 np0005531888 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000039.scope: Deactivated successfully.
Nov 22 02:53:52 np0005531888 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000039.scope: Consumed 5.348s CPU time.
Nov 22 02:53:52 np0005531888 systemd-machined[153106]: Machine qemu-27-instance-00000039 terminated.
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.226 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.232 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.279 186792 INFO nova.virt.libvirt.driver [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Instance destroyed successfully.#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.279 186792 DEBUG nova.objects.instance [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'resources' on Instance uuid e8d09e62-e93e-4700-bec1-fb9aff8683eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.294 186792 DEBUG nova.virt.libvirt.vif [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:52:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-470238106',display_name='tempest-ImagesTestJSON-server-470238106',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-470238106',id=57,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:53:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-nlmpfle3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_r
am='0',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:53:47Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=e8d09e62-e93e-4700-bec1-fb9aff8683eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.295 186792 DEBUG nova.network.os_vif_util [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "address": "fa:16:3e:a8:14:49", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4fc124-33", "ovs_interfaceid": "fe4fc124-3378-4a5c-a252-9c8c23b3164b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.295 186792 DEBUG nova.network.os_vif_util [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.296 186792 DEBUG os_vif [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.297 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.298 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe4fc124-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.300 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.303 186792 INFO os_vif [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:14:49,bridge_name='br-int',has_traffic_filtering=True,id=fe4fc124-3378-4a5c-a252-9c8c23b3164b,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4fc124-33')#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.304 186792 INFO nova.virt.libvirt.driver [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Deleting instance files /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb_del#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.305 186792 INFO nova.virt.libvirt.driver [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Deletion of /var/lib/nova/instances/e8d09e62-e93e-4700-bec1-fb9aff8683eb_del complete#033[00m
Nov 22 02:53:52 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [NOTICE]   (221549) : haproxy version is 2.8.14-c23fe91
Nov 22 02:53:52 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [NOTICE]   (221549) : path to executable is /usr/sbin/haproxy
Nov 22 02:53:52 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [WARNING]  (221549) : Exiting Master process...
Nov 22 02:53:52 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [ALERT]    (221549) : Current worker (221551) exited with code 143 (Terminated)
Nov 22 02:53:52 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[221545]: [WARNING]  (221549) : All workers exited. Exiting... (0)
Nov 22 02:53:52 np0005531888 systemd[1]: libpod-f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc.scope: Deactivated successfully.
Nov 22 02:53:52 np0005531888 podman[221999]: 2025-11-22 07:53:52.397641884 +0000 UTC m=+0.194981609 container died f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.413 186792 INFO nova.compute.manager [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.414 186792 DEBUG oslo.service.loopingcall [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.414 186792 DEBUG nova.compute.manager [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.414 186792 DEBUG nova.network.neutron [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.737 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc-userdata-shm.mount: Deactivated successfully.
Nov 22 02:53:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay-f45ceaeebb9eff22d79d27eda44481830fe3cc077dcb9354c0badb3d8089585d-merged.mount: Deactivated successfully.
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.994 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:52 np0005531888 nova_compute[186788]: 2025-11-22 07:53:52.995 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:53:53 np0005531888 podman[221999]: 2025-11-22 07:53:53.96865289 +0000 UTC m=+1.765992585 container cleanup f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:53:53 np0005531888 systemd[1]: libpod-conmon-f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc.scope: Deactivated successfully.
Nov 22 02:53:54 np0005531888 podman[222045]: 2025-11-22 07:53:54.057204353 +0000 UTC m=+0.174400652 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.091 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.092 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.35067749023438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.093 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.093 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.398 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance e8d09e62-e93e-4700-bec1-fb9aff8683eb actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.399 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.399 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:53:54 np0005531888 podman[222058]: 2025-11-22 07:53:54.407714547 +0000 UTC m=+0.407006581 container remove f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.418 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b4173b49-9a00-440b-b0a6-6b2123275300]: (4, ('Sat Nov 22 07:53:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc)\nf179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc\nSat Nov 22 07:53:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (f179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc)\nf179ede2d36746bdb75fb7f8440a7dfc3ccac8a8cd62c33ae44fa5d702ce40fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.421 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[605d7100-2896-4916-ab26-f0e2070877b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.422 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.426 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:54 np0005531888 kernel: tapdc6b9ee8-e0: left promiscuous mode
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.439 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.443 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1e80267e-712e-4358-bae5-2146ef283822]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.471 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5358e007-7cd7-4e2b-b279-d6c50879dbf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.473 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[56a950f7-c90f-4065-92a5-19deb537f766]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e33fe999-906f-4008-bcbf-b1fa8a92f166]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473244, 'reachable_time': 23287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222088, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.498 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:53:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:53:54.499 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c10389-412d-4299-852c-ef59e06f7b76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:53:54 np0005531888 systemd[1]: run-netns-ovnmeta\x2ddc6b9ee8\x2de824\x2d42ea\x2dbe5e\x2d5b3c4e48e46a.mount: Deactivated successfully.
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.858 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:53:54 np0005531888 nova_compute[186788]: 2025-11-22 07:53:54.884 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:53:55 np0005531888 nova_compute[186788]: 2025-11-22 07:53:55.019 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:53:55 np0005531888 nova_compute[186788]: 2025-11-22 07:53:55.019 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:55 np0005531888 nova_compute[186788]: 2025-11-22 07:53:55.642 186792 DEBUG nova.network.neutron [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:53:55 np0005531888 nova_compute[186788]: 2025-11-22 07:53:55.731 186792 INFO nova.compute.manager [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Took 3.32 seconds to deallocate network for instance.#033[00m
Nov 22 02:53:55 np0005531888 nova_compute[186788]: 2025-11-22 07:53:55.990 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:55 np0005531888 nova_compute[186788]: 2025-11-22 07:53:55.991 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.056 186792 DEBUG nova.compute.provider_tree [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.091 186792 DEBUG nova.compute.manager [req-fc4ff79f-fb79-40e2-9e9f-d74dcaebaae2 req-0b34e129-9edf-4b27-9c98-405d257b3b5b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received event network-vif-deleted-fe4fc124-3378-4a5c-a252-9c8c23b3164b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.093 186792 DEBUG nova.scheduler.client.report [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.139 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.249 186792 INFO nova.scheduler.client.report [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Deleted allocations for instance e8d09e62-e93e-4700-bec1-fb9aff8683eb#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.335 186792 DEBUG oslo_concurrency.lockutils [None req-0285ae10-cf70-4d30-9e03-6ca7c1eeb801 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.770 186792 DEBUG nova.compute.manager [req-b3d5a9db-c57b-4884-8787-0a8689e6e1db req-e04de70f-33bf-4511-b39d-cccfd14cc3a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.771 186792 DEBUG oslo_concurrency.lockutils [req-b3d5a9db-c57b-4884-8787-0a8689e6e1db req-e04de70f-33bf-4511-b39d-cccfd14cc3a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.771 186792 DEBUG oslo_concurrency.lockutils [req-b3d5a9db-c57b-4884-8787-0a8689e6e1db req-e04de70f-33bf-4511-b39d-cccfd14cc3a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.771 186792 DEBUG oslo_concurrency.lockutils [req-b3d5a9db-c57b-4884-8787-0a8689e6e1db req-e04de70f-33bf-4511-b39d-cccfd14cc3a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e8d09e62-e93e-4700-bec1-fb9aff8683eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.771 186792 DEBUG nova.compute.manager [req-b3d5a9db-c57b-4884-8787-0a8689e6e1db req-e04de70f-33bf-4511-b39d-cccfd14cc3a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] No waiting events found dispatching network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:53:56 np0005531888 nova_compute[186788]: 2025-11-22 07:53:56.772 186792 WARNING nova.compute.manager [req-b3d5a9db-c57b-4884-8787-0a8689e6e1db req-e04de70f-33bf-4511-b39d-cccfd14cc3a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Received unexpected event network-vif-plugged-fe4fc124-3378-4a5c-a252-9c8c23b3164b for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:53:57 np0005531888 nova_compute[186788]: 2025-11-22 07:53:57.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:57 np0005531888 nova_compute[186788]: 2025-11-22 07:53:57.739 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:58 np0005531888 podman[222090]: 2025-11-22 07:53:58.701444291 +0000 UTC m=+0.071964107 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 02:53:58 np0005531888 nova_compute[186788]: 2025-11-22 07:53:58.834 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:58 np0005531888 nova_compute[186788]: 2025-11-22 07:53:58.835 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:58 np0005531888 nova_compute[186788]: 2025-11-22 07:53:58.860 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.046 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.046 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.053 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.054 186792 INFO nova.compute.claims [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.363 186792 DEBUG nova.compute.provider_tree [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.382 186792 DEBUG nova.scheduler.client.report [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.459 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.460 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.518 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.519 186792 DEBUG nova.network.neutron [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.538 186792 INFO nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.556 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.709 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.711 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.712 186792 INFO nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Creating image(s)#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.712 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "/var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.713 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.714 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.732 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.801 186792 DEBUG nova.policy [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.807 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.808 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.808 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.820 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.883 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:53:59 np0005531888 nova_compute[186788]: 2025-11-22 07:53:59.884 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.220 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk 1073741824" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.221 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.222 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.285 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.287 186792 DEBUG nova.virt.disk.api [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Checking if we can resize image /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.287 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.350 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.351 186792 DEBUG nova.virt.disk.api [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Cannot resize image /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.352 186792 DEBUG nova.objects.instance [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'migration_context' on Instance uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.392 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.393 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Ensure instance console log exists: /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.394 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.395 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.395 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:00 np0005531888 nova_compute[186788]: 2025-11-22 07:54:00.703 186792 DEBUG nova.network.neutron [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Successfully created port: af97aa95-4802-4456-ae07-64ec497d0797 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.184 186792 DEBUG nova.network.neutron [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Successfully updated port: af97aa95-4802-4456-ae07-64ec497d0797 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.261 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "refresh_cache-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.261 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquired lock "refresh_cache-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.261 186792 DEBUG nova.network.neutron [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.304 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.431 186792 DEBUG nova.compute.manager [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received event network-changed-af97aa95-4802-4456-ae07-64ec497d0797 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.432 186792 DEBUG nova.compute.manager [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Refreshing instance network info cache due to event network-changed-af97aa95-4802-4456-ae07-64ec497d0797. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.432 186792 DEBUG oslo_concurrency.lockutils [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.740 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:02 np0005531888 nova_compute[186788]: 2025-11-22 07:54:02.986 186792 DEBUG nova.network.neutron [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.020 186792 DEBUG nova.network.neutron [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Updating instance_info_cache with network_info: [{"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.058 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Releasing lock "refresh_cache-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.059 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance network_info: |[{"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.060 186792 DEBUG oslo_concurrency.lockutils [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.060 186792 DEBUG nova.network.neutron [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Refreshing network info cache for port af97aa95-4802-4456-ae07-64ec497d0797 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.064 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Start _get_guest_xml network_info=[{"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.071 186792 WARNING nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.082 186792 DEBUG nova.virt.libvirt.host [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.084 186792 DEBUG nova.virt.libvirt.host [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.094 186792 DEBUG nova.virt.libvirt.host [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.095 186792 DEBUG nova.virt.libvirt.host [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.096 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.097 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.097 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.098 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.098 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.098 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.098 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.099 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.099 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.099 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.100 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.100 186792 DEBUG nova.virt.hardware [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.105 186792 DEBUG nova.virt.libvirt.vif [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:53:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1094572399',display_name='tempest-ImagesTestJSON-server-1094572399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1094572399',id=60,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-u3a0bg8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:53:59Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.105 186792 DEBUG nova.network.os_vif_util [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.106 186792 DEBUG nova.network.os_vif_util [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.108 186792 DEBUG nova.objects.instance [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.122 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <uuid>8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c</uuid>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <name>instance-0000003c</name>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:name>tempest-ImagesTestJSON-server-1094572399</nova:name>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:54:06</nova:creationTime>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:user uuid="1ac2d2381d294c96aff369941185056a">tempest-ImagesTestJSON-117614339-project-member</nova:user>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:project uuid="7ec4007dc8214caab4e2eb40f11fb3cd">tempest-ImagesTestJSON-117614339</nova:project>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        <nova:port uuid="af97aa95-4802-4456-ae07-64ec497d0797">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <entry name="serial">8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c</entry>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <entry name="uuid">8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c</entry>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.config"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:a9:6f:0d"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <target dev="tapaf97aa95-48"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/console.log" append="off"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:54:06 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:54:06 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:54:06 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:54:06 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.124 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Preparing to wait for external event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.125 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.125 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.125 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.126 186792 DEBUG nova.virt.libvirt.vif [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:53:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1094572399',display_name='tempest-ImagesTestJSON-server-1094572399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1094572399',id=60,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-u3a0bg8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:53:59Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.127 186792 DEBUG nova.network.os_vif_util [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.128 186792 DEBUG nova.network.os_vif_util [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.128 186792 DEBUG os_vif [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.129 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.129 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.130 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.135 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.136 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf97aa95-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.137 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf97aa95-48, col_values=(('external_ids', {'iface-id': 'af97aa95-4802-4456-ae07-64ec497d0797', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:6f:0d', 'vm-uuid': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.139 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:06 np0005531888 NetworkManager[55166]: <info>  [1763798046.1404] manager: (tapaf97aa95-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.141 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.150 186792 INFO os_vif [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48')#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.240 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.241 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.241 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No VIF found with MAC fa:16:3e:a9:6f:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:54:06 np0005531888 nova_compute[186788]: 2025-11-22 07:54:06.242 186792 INFO nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Using config drive#033[00m
Nov 22 02:54:06 np0005531888 podman[222128]: 2025-11-22 07:54:06.709274018 +0000 UTC m=+0.075721070 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:54:06 np0005531888 podman[222129]: 2025-11-22 07:54:06.730745076 +0000 UTC m=+0.093256441 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.278 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798032.2764168, e8d09e62-e93e-4700-bec1-fb9aff8683eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.278 186792 INFO nova.compute.manager [-] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.321 186792 DEBUG nova.compute.manager [None req-f10404e8-7f26-4bfd-86b1-77cc23fb5de1 - - - - - -] [instance: e8d09e62-e93e-4700-bec1-fb9aff8683eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.564 186792 INFO nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Creating config drive at /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.config#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.570 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_ivuowug execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.705 186792 DEBUG oslo_concurrency.processutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_ivuowug" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.743 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:07 np0005531888 kernel: tapaf97aa95-48: entered promiscuous mode
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.780 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:07 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:07Z|00152|binding|INFO|Claiming lport af97aa95-4802-4456-ae07-64ec497d0797 for this chassis.
Nov 22 02:54:07 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:07Z|00153|binding|INFO|af97aa95-4802-4456-ae07-64ec497d0797: Claiming fa:16:3e:a9:6f:0d 10.100.0.8
Nov 22 02:54:07 np0005531888 NetworkManager[55166]: <info>  [1763798047.7819] manager: (tapaf97aa95-48): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Nov 22 02:54:07 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:07Z|00154|binding|INFO|Setting lport af97aa95-4802-4456-ae07-64ec497d0797 ovn-installed in OVS
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.797 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:07 np0005531888 nova_compute[186788]: 2025-11-22 07:54:07.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:07 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:07Z|00155|binding|INFO|Setting lport af97aa95-4802-4456-ae07-64ec497d0797 up in Southbound
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.803 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:6f:0d 10.100.0.8'], port_security=['fa:16:3e:a9:6f:0d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=af97aa95-4802-4456-ae07-64ec497d0797) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.805 104023 INFO neutron.agent.ovn.metadata.agent [-] Port af97aa95-4802-4456-ae07-64ec497d0797 in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a bound to our chassis#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.806 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a#033[00m
Nov 22 02:54:07 np0005531888 systemd-udevd[222190]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.820 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[99d5695b-8c05-46a5-8ad3-8db3e869c171]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.823 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc6b9ee8-e1 in ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.825 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc6b9ee8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.825 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4b78e0f1-7ba1-47cb-8d9b-d1aafb2a90a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.826 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[61857160-130a-4da3-b448-3a35b2b0c419]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 NetworkManager[55166]: <info>  [1763798047.8330] device (tapaf97aa95-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:54:07 np0005531888 NetworkManager[55166]: <info>  [1763798047.8342] device (tapaf97aa95-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:54:07 np0005531888 systemd-machined[153106]: New machine qemu-29-instance-0000003c.
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.840 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3bfabc-2bae-47b0-b494-df8515215546]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 systemd[1]: Started Virtual Machine qemu-29-instance-0000003c.
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.856 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5dafb69e-8571-41ac-ba84-c52f936a62bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.888 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6c416cab-e154-4b71-9889-c8f96e410054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 NetworkManager[55166]: <info>  [1763798047.8956] manager: (tapdc6b9ee8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.896 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[41c5f9e8-4ad9-4252-81ca-1d037b6f9147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.927 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b218a357-d85a-488d-affb-f9ca72ae780a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.930 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd87bb5-dd47-41f6-81b3-591d8c5fc17b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 NetworkManager[55166]: <info>  [1763798047.9579] device (tapdc6b9ee8-e0): carrier: link connected
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.965 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5d8f8d-a882-4caf-94d3-f18cc4dfe78e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:07.985 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a442b3e1-15ad-4357-987b-61ab55f4175a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478860, 'reachable_time': 38498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222223, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.003 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5336863-3259-47ab-bc1a-6b30eb7d763a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d89c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 478860, 'tstamp': 478860}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222224, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.024 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[16e259bb-7fe6-450e-b129-9864b9a6ebc1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478860, 'reachable_time': 38498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222225, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.061 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a97f376b-b8f4-426d-8fd3-56488d31b40c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.131 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[20f32df1-7489-4eac-99ce-25ea8f158836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.133 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.133 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.134 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc6b9ee8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:08 np0005531888 kernel: tapdc6b9ee8-e0: entered promiscuous mode
Nov 22 02:54:08 np0005531888 NetworkManager[55166]: <info>  [1763798048.1367] manager: (tapdc6b9ee8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.136 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.141 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc6b9ee8-e0, col_values=(('external_ids', {'iface-id': '99cae854-daa9-4d08-8152-257a15e21bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.142 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:08 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:08Z|00156|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.144 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.145 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2d2734-1c2b-49f4-91fd-0fcc8d0d2443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.146 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:54:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:08.147 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'env', 'PROCESS_TAG=haproxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.154 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.561 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798048.5607615, 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.562 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] VM Started (Lifecycle Event)#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.585 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.590 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798048.5611467, 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.591 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.607 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.612 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:08 np0005531888 nova_compute[186788]: 2025-11-22 07:54:08.634 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:08 np0005531888 podman[222263]: 2025-11-22 07:54:08.555373857 +0000 UTC m=+0.023612871 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:54:08 np0005531888 podman[222263]: 2025-11-22 07:54:08.797025138 +0000 UTC m=+0.265264132 container create 7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:54:08 np0005531888 systemd[1]: Started libpod-conmon-7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd.scope.
Nov 22 02:54:08 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:54:08 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1713a8c357262b9239a1cd158a3a026d96c7611a09d2ab2f4319e09ef4565a8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:54:08 np0005531888 podman[222263]: 2025-11-22 07:54:08.883482981 +0000 UTC m=+0.351721995 container init 7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:54:08 np0005531888 podman[222263]: 2025-11-22 07:54:08.890149045 +0000 UTC m=+0.358388039 container start 7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:54:08 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [NOTICE]   (222284) : New worker (222286) forked
Nov 22 02:54:08 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [NOTICE]   (222284) : Loading success.
Nov 22 02:54:09 np0005531888 nova_compute[186788]: 2025-11-22 07:54:09.871 186792 DEBUG nova.network.neutron [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Updated VIF entry in instance network info cache for port af97aa95-4802-4456-ae07-64ec497d0797. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:54:09 np0005531888 nova_compute[186788]: 2025-11-22 07:54:09.872 186792 DEBUG nova.network.neutron [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Updating instance_info_cache with network_info: [{"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:09 np0005531888 nova_compute[186788]: 2025-11-22 07:54:09.910 186792 DEBUG oslo_concurrency.lockutils [req-87f04eab-bd12-4750-ae9d-6767e80ab61c req-02633a6f-f734-408f-8ce2-cc399cd54f9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.756 186792 DEBUG nova.compute.manager [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.757 186792 DEBUG oslo_concurrency.lockutils [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.757 186792 DEBUG oslo_concurrency.lockutils [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.757 186792 DEBUG oslo_concurrency.lockutils [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.758 186792 DEBUG nova.compute.manager [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Processing event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.758 186792 DEBUG nova.compute.manager [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.758 186792 DEBUG oslo_concurrency.lockutils [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.758 186792 DEBUG oslo_concurrency.lockutils [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.759 186792 DEBUG oslo_concurrency.lockutils [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.759 186792 DEBUG nova.compute.manager [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] No waiting events found dispatching network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.759 186792 WARNING nova.compute.manager [req-6ed21607-7674-4e7c-b3df-d431b78eb629 req-b0e7188f-d3bc-4a80-b319-bfd9a0adf9dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received unexpected event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.760 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.765 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798050.7654405, 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.766 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.768 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.771 186792 INFO nova.virt.libvirt.driver [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance spawned successfully.#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.772 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.798 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.808 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.813 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.813 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.814 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.814 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.815 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.815 186792 DEBUG nova.virt.libvirt.driver [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.849 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.927 186792 INFO nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Took 11.22 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:54:10 np0005531888 nova_compute[186788]: 2025-11-22 07:54:10.927 186792 DEBUG nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:11 np0005531888 nova_compute[186788]: 2025-11-22 07:54:11.105 186792 INFO nova.compute.manager [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Took 12.09 seconds to build instance.#033[00m
Nov 22 02:54:11 np0005531888 nova_compute[186788]: 2025-11-22 07:54:11.141 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:11 np0005531888 nova_compute[186788]: 2025-11-22 07:54:11.194 186792 DEBUG oslo_concurrency.lockutils [None req-cd0fe699-7e2f-4260-a893-1af6505af3f7 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:11 np0005531888 nova_compute[186788]: 2025-11-22 07:54:11.997 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:11 np0005531888 nova_compute[186788]: 2025-11-22 07:54:11.998 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.025 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.188 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.188 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.196 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.197 186792 INFO nova.compute.claims [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.427 186792 DEBUG nova.compute.provider_tree [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.444 186792 DEBUG nova.scheduler.client.report [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.476 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.477 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.537 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.538 186792 DEBUG nova.network.neutron [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.567 186792 INFO nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.586 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:54:12 np0005531888 podman[222295]: 2025-11-22 07:54:12.687167685 +0000 UTC m=+0.061485730 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.716 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.719 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.719 186792 INFO nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Creating image(s)#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.720 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "/var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.720 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.721 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.734 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.754 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.802 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.803 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.803 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.819 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.882 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.883 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.925 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.926 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.926 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.995 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.997 186792 DEBUG nova.virt.disk.api [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Checking if we can resize image /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:12 np0005531888 nova_compute[186788]: 2025-11-22 07:54:12.997 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.042 186792 DEBUG nova.policy [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.054 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.054 186792 DEBUG nova.virt.disk.api [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Cannot resize image /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.055 186792 DEBUG nova.objects.instance [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'migration_context' on Instance uuid c2b016c4-0e79-4389-ad09-9b9362320ac7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.080 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.081 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Ensure instance console log exists: /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.081 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.082 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.082 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.486 186792 DEBUG oslo_concurrency.lockutils [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.487 186792 DEBUG oslo_concurrency.lockutils [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.487 186792 DEBUG nova.compute.manager [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.492 186792 DEBUG nova.compute.manager [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.493 186792 DEBUG nova.objects.instance [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'flavor' on Instance uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.527 186792 DEBUG nova.objects.instance [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'info_cache' on Instance uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:13 np0005531888 nova_compute[186788]: 2025-11-22 07:54:13.582 186792 DEBUG nova.virt.libvirt.driver [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:54:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:14.527 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:14 np0005531888 nova_compute[186788]: 2025-11-22 07:54:14.527 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:14.529 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:54:15 np0005531888 nova_compute[186788]: 2025-11-22 07:54:15.157 186792 DEBUG nova.network.neutron [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Successfully created port: c12ff87c-55f2-4e24-b84d-d105cfce590a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:54:15 np0005531888 podman[222335]: 2025-11-22 07:54:15.677764209 +0000 UTC m=+0.051573787 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:54:16 np0005531888 nova_compute[186788]: 2025-11-22 07:54:16.144 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:17 np0005531888 nova_compute[186788]: 2025-11-22 07:54:17.747 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:18.532 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.147 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.161 186792 DEBUG nova.network.neutron [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Successfully updated port: c12ff87c-55f2-4e24-b84d-d105cfce590a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.219 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "refresh_cache-c2b016c4-0e79-4389-ad09-9b9362320ac7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.220 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquired lock "refresh_cache-c2b016c4-0e79-4389-ad09-9b9362320ac7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.220 186792 DEBUG nova.network.neutron [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.314 186792 DEBUG nova.compute.manager [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-changed-c12ff87c-55f2-4e24-b84d-d105cfce590a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.315 186792 DEBUG nova.compute.manager [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Refreshing instance network info cache due to event network-changed-c12ff87c-55f2-4e24-b84d-d105cfce590a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.315 186792 DEBUG oslo_concurrency.lockutils [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c2b016c4-0e79-4389-ad09-9b9362320ac7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:21 np0005531888 nova_compute[186788]: 2025-11-22 07:54:21.757 186792 DEBUG nova.network.neutron [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:54:22 np0005531888 podman[222355]: 2025-11-22 07:54:22.697208663 +0000 UTC m=+0.065120939 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:22 np0005531888 nova_compute[186788]: 2025-11-22 07:54:22.749 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:23 np0005531888 nova_compute[186788]: 2025-11-22 07:54:23.638 186792 DEBUG nova.virt.libvirt.driver [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.322 186792 DEBUG nova.network.neutron [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Updating instance_info_cache with network_info: [{"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.457 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Releasing lock "refresh_cache-c2b016c4-0e79-4389-ad09-9b9362320ac7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.458 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Instance network_info: |[{"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.458 186792 DEBUG oslo_concurrency.lockutils [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c2b016c4-0e79-4389-ad09-9b9362320ac7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.458 186792 DEBUG nova.network.neutron [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Refreshing network info cache for port c12ff87c-55f2-4e24-b84d-d105cfce590a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.462 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Start _get_guest_xml network_info=[{"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.467 186792 WARNING nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.482 186792 DEBUG nova.virt.libvirt.host [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.483 186792 DEBUG nova.virt.libvirt.host [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.491 186792 DEBUG nova.virt.libvirt.host [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.492 186792 DEBUG nova.virt.libvirt.host [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.493 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.493 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.494 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.494 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.495 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.495 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.495 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.495 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.496 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.496 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.496 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.497 186792 DEBUG nova.virt.hardware [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.502 186792 DEBUG nova.virt.libvirt.vif [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1555607633',display_name='tempest-ListServerFiltersTestJSON-instance-1555607633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1555607633',id=63,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ihsevoj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempe
st-ListServerFiltersTestJSON-1217253496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:12Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=c2b016c4-0e79-4389-ad09-9b9362320ac7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.502 186792 DEBUG nova.network.os_vif_util [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.503 186792 DEBUG nova.network.os_vif_util [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.504 186792 DEBUG nova.objects.instance [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'pci_devices' on Instance uuid c2b016c4-0e79-4389-ad09-9b9362320ac7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.521 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <uuid>c2b016c4-0e79-4389-ad09-9b9362320ac7</uuid>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <name>instance-0000003f</name>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1555607633</nova:name>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:54:24</nova:creationTime>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:user uuid="6d9b8aa760ed4afdbf24f9deb5d29190">tempest-ListServerFiltersTestJSON-1217253496-project-member</nova:user>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:project uuid="b4ca2b2e65ac4bf8b3d14f3310a3a7bf">tempest-ListServerFiltersTestJSON-1217253496</nova:project>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        <nova:port uuid="c12ff87c-55f2-4e24-b84d-d105cfce590a">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <entry name="serial">c2b016c4-0e79-4389-ad09-9b9362320ac7</entry>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <entry name="uuid">c2b016c4-0e79-4389-ad09-9b9362320ac7</entry>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.config"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:78:e2:00"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <target dev="tapc12ff87c-55"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/console.log" append="off"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:54:24 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:54:24 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:54:24 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:54:24 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.523 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Preparing to wait for external event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.524 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.524 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.524 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.526 186792 DEBUG nova.virt.libvirt.vif [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1555607633',display_name='tempest-ListServerFiltersTestJSON-instance-1555607633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1555607633',id=63,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ihsevoj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:12Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=c2b016c4-0e79-4389-ad09-9b9362320ac7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.526 186792 DEBUG nova.network.os_vif_util [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.527 186792 DEBUG nova.network.os_vif_util [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.527 186792 DEBUG os_vif [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.528 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.530 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.530 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.535 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.535 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc12ff87c-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.536 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc12ff87c-55, col_values=(('external_ids', {'iface-id': 'c12ff87c-55f2-4e24-b84d-d105cfce590a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:e2:00', 'vm-uuid': 'c2b016c4-0e79-4389-ad09-9b9362320ac7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:24 np0005531888 NetworkManager[55166]: <info>  [1763798064.5390] manager: (tapc12ff87c-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.540 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.546 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.548 186792 INFO os_vif [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55')#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.632 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.633 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.633 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No VIF found with MAC fa:16:3e:78:e2:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:54:24 np0005531888 nova_compute[186788]: 2025-11-22 07:54:24.633 186792 INFO nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Using config drive#033[00m
Nov 22 02:54:24 np0005531888 podman[222399]: 2025-11-22 07:54:24.664630419 +0000 UTC m=+0.068113022 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:54:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:25Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:6f:0d 10.100.0.8
Nov 22 02:54:25 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:25Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:6f:0d 10.100.0.8
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.025 186792 INFO nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Creating config drive at /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.config#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.032 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb3jwoazd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.168 186792 DEBUG oslo_concurrency.processutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb3jwoazd" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:26 np0005531888 kernel: tapc12ff87c-55: entered promiscuous mode
Nov 22 02:54:26 np0005531888 NetworkManager[55166]: <info>  [1763798066.2385] manager: (tapc12ff87c-55): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 22 02:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:26Z|00157|binding|INFO|Claiming lport c12ff87c-55f2-4e24-b84d-d105cfce590a for this chassis.
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.239 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:26Z|00158|binding|INFO|c12ff87c-55f2-4e24-b84d-d105cfce590a: Claiming fa:16:3e:78:e2:00 10.100.0.10
Nov 22 02:54:26 np0005531888 systemd-udevd[222439]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.278 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.284 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:e2:00 10.100.0.10'], port_security=['fa:16:3e:78:e2:00 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=c12ff87c-55f2-4e24-b84d-d105cfce590a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:26Z|00159|binding|INFO|Setting lport c12ff87c-55f2-4e24-b84d-d105cfce590a ovn-installed in OVS
Nov 22 02:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:26Z|00160|binding|INFO|Setting lport c12ff87c-55f2-4e24-b84d-d105cfce590a up in Southbound
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.286 104023 INFO neutron.agent.ovn.metadata.agent [-] Port c12ff87c-55f2-4e24-b84d-d105cfce590a in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed bound to our chassis#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.288 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62930ff4-55a3-4e08-8229-5532aa7dcaed#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 NetworkManager[55166]: <info>  [1763798066.2929] device (tapc12ff87c-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:54:26 np0005531888 NetworkManager[55166]: <info>  [1763798066.2943] device (tapc12ff87c-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:54:26 np0005531888 systemd-machined[153106]: New machine qemu-30-instance-0000003f.
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.302 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbc8330-8953-458d-882f-88735e7f8399]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.303 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62930ff4-51 in ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.306 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62930ff4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.306 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a03691-4d86-4d7c-9809-8bdefb2150ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.307 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[27328dbd-d5d3-42cb-8f75-b92b0ff35ebb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 systemd[1]: Started Virtual Machine qemu-30-instance-0000003f.
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.324 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[245f9761-ead3-467f-b2aa-2ed6af0986c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.346 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b07e47ff-b52b-4e00-84a3-a8725c3c409d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.382 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4c4155-c06d-411f-baad-d7ecbca8b2eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 NetworkManager[55166]: <info>  [1763798066.3914] manager: (tap62930ff4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.391 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[19c3670e-1edf-4bcb-87ec-1579344d04d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.436 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[218bbeb0-1c58-4627-8939-cd8a79cdbadb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.440 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[282988c2-9220-41f8-9ea4-16e4a3cc99ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 NetworkManager[55166]: <info>  [1763798066.4701] device (tap62930ff4-50): carrier: link connected
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.478 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[0510b6ab-9c9b-4815-944b-ca69beed4d53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c59f2c7f-119e-43f7-928c-746be3d87b7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480711, 'reachable_time': 21275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222475, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.513 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f03b4a24-0a44-419b-8396-71a523872568]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:714'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480711, 'tstamp': 480711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222476, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.530 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dc06ad56-bd52-4b0d-9f46-fca355544df6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480711, 'reachable_time': 21275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222477, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.564 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[13537d77-76a1-481c-afe4-eb08125bd5d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.649 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc112ea-79c7-4344-adcb-c632c9420baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.651 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.651 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.651 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62930ff4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.653 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 NetworkManager[55166]: <info>  [1763798066.6543] manager: (tap62930ff4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 22 02:54:26 np0005531888 kernel: tap62930ff4-50: entered promiscuous mode
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.656 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.656 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62930ff4-50, col_values=(('external_ids', {'iface-id': '02324e7a-c5bf-443b-a6e3-5a1cdac9fee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:26Z|00161|binding|INFO|Releasing lport 02324e7a-c5bf-443b-a6e3-5a1cdac9fee4 from this chassis (sb_readonly=0)
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.659 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.662 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.662 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.664 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d408b887-3a90-4339-8390-7f8e54df2773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.665 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:54:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:26.665 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'env', 'PROCESS_TAG=haproxy-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62930ff4-55a3-4e08-8229-5532aa7dcaed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.675 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.719 186792 DEBUG nova.compute.manager [req-47cdfca0-83a2-435e-9bac-7259070ea1c6 req-02ec8072-906e-4d92-9a98-6f12c4788503 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.719 186792 DEBUG oslo_concurrency.lockutils [req-47cdfca0-83a2-435e-9bac-7259070ea1c6 req-02ec8072-906e-4d92-9a98-6f12c4788503 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.719 186792 DEBUG oslo_concurrency.lockutils [req-47cdfca0-83a2-435e-9bac-7259070ea1c6 req-02ec8072-906e-4d92-9a98-6f12c4788503 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.720 186792 DEBUG oslo_concurrency.lockutils [req-47cdfca0-83a2-435e-9bac-7259070ea1c6 req-02ec8072-906e-4d92-9a98-6f12c4788503 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:26 np0005531888 nova_compute[186788]: 2025-11-22 07:54:26.720 186792 DEBUG nova.compute.manager [req-47cdfca0-83a2-435e-9bac-7259070ea1c6 req-02ec8072-906e-4d92-9a98-6f12c4788503 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Processing event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:54:27 np0005531888 podman[222509]: 2025-11-22 07:54:27.072185481 +0000 UTC m=+0.026984154 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:54:27 np0005531888 podman[222509]: 2025-11-22 07:54:27.213963102 +0000 UTC m=+0.168761755 container create 5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.374 186792 DEBUG nova.network.neutron [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Updated VIF entry in instance network info cache for port c12ff87c-55f2-4e24-b84d-d105cfce590a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.375 186792 DEBUG nova.network.neutron [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Updating instance_info_cache with network_info: [{"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.396 186792 DEBUG oslo_concurrency.lockutils [req-cce90467-edd6-4009-bad5-c120b36a6191 req-97b1f98b-9d36-41ef-98a0-712a101cd29d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c2b016c4-0e79-4389-ad09-9b9362320ac7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:27 np0005531888 systemd[1]: Started libpod-conmon-5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac.scope.
Nov 22 02:54:27 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:54:27 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98ce53faa472ff6273b1e8c83d059e3767a94fa7fc69b4d4c19cb618b7d5aada/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:54:27 np0005531888 podman[222509]: 2025-11-22 07:54:27.545862949 +0000 UTC m=+0.500661622 container init 5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:54:27 np0005531888 podman[222509]: 2025-11-22 07:54:27.553976248 +0000 UTC m=+0.508774901 container start 5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:54:27 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [NOTICE]   (222535) : New worker (222538) forked
Nov 22 02:54:27 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [NOTICE]   (222535) : Loading success.
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.600 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.601 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798067.5997772, c2b016c4-0e79-4389-ad09-9b9362320ac7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.602 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] VM Started (Lifecycle Event)#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.604 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.608 186792 INFO nova.virt.libvirt.driver [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Instance spawned successfully.#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.609 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.631 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.638 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.642 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.643 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.643 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.644 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.644 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.644 186792 DEBUG nova.virt.libvirt.driver [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.677 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.678 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798067.6011422, c2b016c4-0e79-4389-ad09-9b9362320ac7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.678 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.703 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.706 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798067.6043553, c2b016c4-0e79-4389-ad09-9b9362320ac7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.706 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.751 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.752 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.759 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.782 186792 INFO nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Took 15.06 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.782 186792 DEBUG nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.791 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.868 186792 INFO nova.compute.manager [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Took 15.77 seconds to build instance.#033[00m
Nov 22 02:54:27 np0005531888 nova_compute[186788]: 2025-11-22 07:54:27.916 186792 DEBUG oslo_concurrency.lockutils [None req-68e9f68a-ef39-4eb7-9f56-d1a07d958ade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:28 np0005531888 nova_compute[186788]: 2025-11-22 07:54:28.873 186792 DEBUG nova.compute.manager [req-b66c02c5-070e-42b9-b2f8-75e302a588f1 req-b70de5d1-08c0-456b-ab86-ca45c136c7ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:28 np0005531888 nova_compute[186788]: 2025-11-22 07:54:28.873 186792 DEBUG oslo_concurrency.lockutils [req-b66c02c5-070e-42b9-b2f8-75e302a588f1 req-b70de5d1-08c0-456b-ab86-ca45c136c7ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:28 np0005531888 nova_compute[186788]: 2025-11-22 07:54:28.874 186792 DEBUG oslo_concurrency.lockutils [req-b66c02c5-070e-42b9-b2f8-75e302a588f1 req-b70de5d1-08c0-456b-ab86-ca45c136c7ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:28 np0005531888 nova_compute[186788]: 2025-11-22 07:54:28.874 186792 DEBUG oslo_concurrency.lockutils [req-b66c02c5-070e-42b9-b2f8-75e302a588f1 req-b70de5d1-08c0-456b-ab86-ca45c136c7ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:28 np0005531888 nova_compute[186788]: 2025-11-22 07:54:28.874 186792 DEBUG nova.compute.manager [req-b66c02c5-070e-42b9-b2f8-75e302a588f1 req-b70de5d1-08c0-456b-ab86-ca45c136c7ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] No waiting events found dispatching network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:28 np0005531888 nova_compute[186788]: 2025-11-22 07:54:28.874 186792 WARNING nova.compute.manager [req-b66c02c5-070e-42b9-b2f8-75e302a588f1 req-b70de5d1-08c0-456b-ab86-ca45c136c7ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received unexpected event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a for instance with vm_state active and task_state None.#033[00m
Nov 22 02:54:29 np0005531888 nova_compute[186788]: 2025-11-22 07:54:29.539 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:29 np0005531888 podman[222547]: 2025-11-22 07:54:29.732252871 +0000 UTC m=+0.098475729 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Nov 22 02:54:32 np0005531888 nova_compute[186788]: 2025-11-22 07:54:32.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:34 np0005531888 nova_compute[186788]: 2025-11-22 07:54:34.543 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:34 np0005531888 nova_compute[186788]: 2025-11-22 07:54:34.700 186792 DEBUG nova.virt.libvirt.driver [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.807 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.807 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.808 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.842 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'hostId': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.846 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'name': 'tempest-ImagesTestJSON-server-1094572399', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'user_id': '1ac2d2381d294c96aff369941185056a', 'hostId': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.862 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.863 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.879 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.880 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89cc0fcc-c057-4087-9726-0da60347a2f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:36.847266', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dc6477e-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.546764864, 'message_signature': '04361953ec3b05724c13a21c6478dbb71fbd84a78d52a00d961d45690cf5e2b1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:36.847266', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dc65cbe-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.546764864, 'message_signature': 'b15cdddb90be5a16ff0ee2d91b46e9b7a1679826b69f38e9efca5d451a56e2f5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-vda', 'timestamp': '2025-11-22T07:54:36.847266', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'instance-0000003c', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dc8e52e-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.563664789, 'message_signature': '95d23eb377c06dd1e896d30087e0b62b3c2999ecfd5fcba05f00e8f78e0f180c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-sda', 'timestamp': '2025-11-22T07:54:36.847266', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'instance-0000003c', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dc8f668-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.563664789, 'message_signature': 'b76f3f06b05e89fdf58e9c3582bb76f60a694e5fe710082f4cd4103bc4cb15b8'}]}, 'timestamp': '2025-11-22 07:54:36.881184', '_unique_id': 'a27e4ccd26ce4f12be805bd2d60b5350'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.889 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c2b016c4-0e79-4389-ad09-9b9362320ac7 / tapc12ff87c-55 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.890 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.892 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c / tapaf97aa95-48 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.893 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bca3cd7-f07b-4c11-95ff-9e147be0fd9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:36.886551', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7dca68d6-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': 'dd852c0a2505ab00e685edc62f1f3854371e99ee022d3aa128cf476aced2b064'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 
'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:36.886551', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7dcad866-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': '2a42eff0f49bca1c2c9156f7f8d5923ca9bd2be9f87232aa8b8f44a7fb839cbf'}]}, 'timestamp': '2025-11-22 07:54:36.893608', '_unique_id': '889d672928ac42d8868dd39435ab9d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:54:36 np0005531888 kernel: tapaf97aa95-48 (unregistering): left promiscuous mode
Nov 22 02:54:36 np0005531888 NetworkManager[55166]: <info>  [1763798076.9258] device (tapaf97aa95-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:54:36 np0005531888 nova_compute[186788]: 2025-11-22 07:54:36.938 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:36Z|00162|binding|INFO|Releasing lport af97aa95-4802-4456-ae07-64ec497d0797 from this chassis (sb_readonly=0)
Nov 22 02:54:36 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:36Z|00163|binding|INFO|Setting lport af97aa95-4802-4456-ae07-64ec497d0797 down in Southbound
Nov 22 02:54:36 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:36Z|00164|binding|INFO|Removing iface tapaf97aa95-48 ovn-installed in OVS
Nov 22 02:54:36 np0005531888 nova_compute[186788]: 2025-11-22 07:54:36.944 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531888 nova_compute[186788]: 2025-11-22 07:54:36.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.965 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:36.966 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:36 np0005531888 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Nov 22 02:54:36 np0005531888 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003c.scope: Consumed 15.065s CPU time.
Nov 22 02:54:36 np0005531888 systemd-machined[153106]: Machine qemu-29-instance-0000003c terminated.
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.990 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:6f:0d 10.100.0.8'], port_security=['fa:16:3e:a9:6f:0d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=af97aa95-4802-4456-ae07-64ec497d0797) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.993 104023 INFO neutron.agent.ovn.metadata.agent [-] Port af97aa95-4802-4456-ae07-64ec497d0797 in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a unbound from our chassis#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.995 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.996 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[819e0e4f-4fed-4f0c-8139-512cff95d112]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:36.997 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace which is not needed anymore#033[00m
Nov 22 02:54:37 np0005531888 podman[222571]: 2025-11-22 07:54:37.036968168 +0000 UTC m=+0.078069807 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:54:37 np0005531888 podman[222572]: 2025-11-22 07:54:37.075879774 +0000 UTC m=+0.112218857 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:54:37 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [NOTICE]   (222284) : haproxy version is 2.8.14-c23fe91
Nov 22 02:54:37 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [NOTICE]   (222284) : path to executable is /usr/sbin/haproxy
Nov 22 02:54:37 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [WARNING]  (222284) : Exiting Master process...
Nov 22 02:54:37 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [WARNING]  (222284) : Exiting Master process...
Nov 22 02:54:37 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [ALERT]    (222284) : Current worker (222286) exited with code 143 (Terminated)
Nov 22 02:54:37 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222280]: [WARNING]  (222284) : All workers exited. Exiting... (0)
Nov 22 02:54:37 np0005531888 systemd[1]: libpod-7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd.scope: Deactivated successfully.
Nov 22 02:54:37 np0005531888 podman[222634]: 2025-11-22 07:54:37.168165928 +0000 UTC m=+0.058987289 container died 7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:37 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd-userdata-shm.mount: Deactivated successfully.
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.217 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b02db3f2-e2f3-4af1-9cc0-19fc62e3497a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:36.898181', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dd5f0e8-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'ed4d9d256075166484eaf97c7b2d995cd8fc32bfaa74e59b8f80e7e84ebae43b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:36.898181', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dd5fcb4-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'f35d560144c3e7463d4ed457edb5e3f3af13e54faf4b8fb4020f972c3d3b4422'}]}, 'timestamp': '2025-11-22 07:54:37.218018', '_unique_id': '2ce3da9aae95419896ea9e10b9ee51af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.219 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.220 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.221 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 systemd[1]: var-lib-containers-storage-overlay-1713a8c357262b9239a1cd158a3a026d96c7611a09d2ab2f4319e09ef4565a8e-merged.mount: Deactivated successfully.
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.227 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71d5ec81-1512-40e1-bb5a-4a7b647f5c75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.220853', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dfcd884-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': '158bbb480a0b83880bc6f423e4a79c2f369ec86ea3a36e8ab5bb192db645b639'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.220853', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dfce5a4-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'e6aec5dc23de360d033b372c6481b08339715265d1f9047f046280ee2a8a3e02'}]}, 'timestamp': '2025-11-22 07:54:37.227514', '_unique_id': '108f6113fe99484ca6b220406ad91b59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.229 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.230 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.230 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '775847bc-975c-4ffe-9039-5e6635794599', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.230100', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7dfe4142-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': 'e97757fa1a69aeeb80747ca693aeb1c07e8b66ba585d790abd55d6620395725f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.230100', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7dfe6ab4-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': '931aef27184428593a08212f18a288502f0aecfd8116253ff1d512e9735d5128'}]}, 'timestamp': '2025-11-22 07:54:37.231512', '_unique_id': '0069e673477a4156adec603942d188da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.232 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.234 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.read.latency volume: 924259258 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.234 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.read.latency volume: 2959782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.236 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9dcae0c-9159-4430-9822-3b5f146239a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 924259258, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.234113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dfede54-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': '4ab1fbc4a67eab8826b67d3046299dfe0fdf118f7198fa510b214f053e4c197a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2959782, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 
'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.234113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dfef7e0-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'd7dfe704aa3838772de6e5055f51ee9cacbdc368fdf3683ed001399beed0e469'}]}, 'timestamp': '2025-11-22 07:54:37.236602', '_unique_id': 'f55b6c745b8940f8a46b39e4da406bbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.237 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.239 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.239 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>]
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.240 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.240 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3232d90-282c-4aa9-9b39-82258096b08d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.240186', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7dffcdbe-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': 'b4945b5abe7aa7bc2fa8939e763516dff2f59bae099e4aee6ed904c84fd7a8a3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.240186', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7dffdf34-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': 'd53c250b56a9a3db5ed742cf8ff0ba23bccd3b55430a83679cc15d1f304cebd6'}]}, 'timestamp': '2025-11-22 07:54:37.241067', '_unique_id': 'd16a73920b13405d8b6fb841808ea1de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.242 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.243 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.244 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '087bf9ad-1d8f-4e46-9e45-c600228ca4e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.243411', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e0050e0-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': 'f5caac87d390546762739e5a117b59f5c3e343059731338c381e153bc779cc27'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 
'1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.243411', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e006076-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': '40931095e2ef1c8aac7177095782b73d8ebc1667a5b7bddef036ec9364a1287f'}]}, 'timestamp': '2025-11-22 07:54:37.244375', '_unique_id': '9f5920f1bea848dd800ec10b9475d531'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.245 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.246 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.246 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.248 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f5ddd6f-c837-4636-94c2-f07840fb9fa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.246346', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e00bcba-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': '417e75939359d78b333269052c8154665a9e05b6c4706c4b5f2356ca3174a47e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.246346', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e00c9bc-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': '110a0d0c6158b1984080477e99c53a9bdf9f3b18c6b261b60a6e82525e55154c'}]}, 'timestamp': '2025-11-22 07:54:37.248392', '_unique_id': '6128102717844e2584f15c5a46ca9eac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.249 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.250 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.251 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33a3a189-2626-4dee-a254-4f9026b6b79b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.250662', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e01652a-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': 'f17f2efd55b545fa3713b4dabae81d5aaabfc3ec5e20469977437af054bf1767'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.250662', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e0171a0-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': '7d4de0fb5a0d1ac5e3f7e73f7d8a0aa6b153d07c13947a10edc4200e3ae1c9a7'}]}, 'timestamp': '2025-11-22 07:54:37.251369', '_unique_id': '33987d58726b43feb8a5b203d742a27d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.252 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.253 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.254 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>]
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.254 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.254 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5171a2ad-dd69-4be9-8dfc-3603731daa8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.254371', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e01f5a8-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': 'f6eac7fc2fdecefe93efda401a737e61cb9d31e8f367e92095c21598fd36437c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.254371', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e020110-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': 'b072c5e687b5c3bdb17cc655ee95cbbcba0ddc40e17408b00029d9bb582dacd8'}]}, 'timestamp': '2025-11-22 07:54:37.254995', '_unique_id': 'c1238398f31c4fdbb78998bc6c44723a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.255 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.256 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.256 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>]
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.257 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.257 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1555607633>, <NovaLikeServer: tempest-ImagesTestJSON-server-1094572399>]
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.257 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:54:37 np0005531888 podman[222634]: 2025-11-22 07:54:37.273053143 +0000 UTC m=+0.163874504 container cleanup 7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.280 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/cpu volume: 9300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.282 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 systemd[1]: libpod-conmon-7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd.scope: Deactivated successfully.
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '409d6336-2514-4c8d-9f2a-2c6ddafbb676', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9300000000, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'timestamp': '2025-11-22T07:54:37.257706', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7e061bf6-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.979723273, 'message_signature': '9a5e8f8269200c8ef609ba57eeaf54906c0e994525fcbfc4dc4718806092f747'}]}, 'timestamp': '2025-11-22 07:54:37.283305', '_unique_id': '43a5b103b1804babbe19603500bb053e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.284 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.287 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.287 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3632bb9-169a-4c69-ace6-d72ea79a03ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.287257', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e06fab2-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': '37d7dde0f56bb80f43dd6e8da893c930cde84b6a99d63f1e7b615e4f6aecddd2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.287257', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e0708d6-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': '5df932e00e027ebf46f19e3b4e0fcf973306855b38f73027e7bb62c97cf19d39'}]}, 'timestamp': '2025-11-22 07:54:37.288001', '_unique_id': 'adfbccc8be634c1687ee172b0914a105'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.288 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.289 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c4256e6-2f4c-4bb8-958d-c82e1b39b0e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.289860', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e075d9a-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': '4ae5abde3b9ba9376b2e525bf35a166c1a1d1e5d43eb37bf086646f7e49134ce'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.289860', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e07677c-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': 'd6fec3af03e0ca34f44bb52a60248707e37b88fedd132fd7d2e30a0b46e869a2'}]}, 'timestamp': '2025-11-22 07:54:37.290383', '_unique_id': 'fe9807f662834128894843e7db2ecd13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.290 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.291 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.292 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.292 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25cc298d-9665-4599-94f7-af5c8be929ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.292135', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e07b7c2-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': '5765c0627c0ea880a9a95a74063583b005f16a041f229f4ed9f55495051bf48b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.292135', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e07c67c-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': 'f927e7621acaf586b17eee9f0f999f46f0476c06587c2b363359acf53dfc9d9f'}]}, 'timestamp': '2025-11-22 07:54:37.292853', '_unique_id': '843163c531864d0382e4d71a7f3806ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.293 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.294 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.295 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.295 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.295 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1982beea-0c23-4789-ba9b-02235576b6c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.294861', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e08234c-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.546764864, 'message_signature': 'dbe812512dd0841bc278c1f2e8172781c92a0b7f18b072d1e3a859a232e107a3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 
'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.294861', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e083062-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.546764864, 'message_signature': 'be209fe2edc49c8dccc9c7ba9e87b285b9c0ffebcd8bda83d6a1971719583bf6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-vda', 'timestamp': '2025-11-22T07:54:37.294861', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'instance-0000003c', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e083de6-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.563664789, 'message_signature': 'f5bba7e7273adb7590994b695cdd7aa4cd769b38fabdc58e4dbad0b0ca6502fd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-sda', 'timestamp': '2025-11-22T07:54:37.294861', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'instance-0000003c', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e084ade-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.563664789, 'message_signature': '3769c827d9f86d9b2fe3340858baf1490e1c28391b6b0d70472b51d2b34fbbcf'}]}, 'timestamp': '2025-11-22 07:54:37.296244', '_unique_id': 'da0fd9e4f3ac4f91be7cd8c7c9a5cf13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.297 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.299 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.299 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5337b790-19a8-415f-9581-4495bd4e5625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003f-c2b016c4-0e79-4389-ad09-9b9362320ac7-tapc12ff87c-55', 'timestamp': '2025-11-22T07:54:37.299014', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'tapc12ff87c-55', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:78:e2:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc12ff87c-55'}, 'message_id': '7e08c64e-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.58613152, 'message_signature': '2ccaa2c4400ba76338211b9cf4a5c2b72df3fb6ac576c71d5d0c6f2208463469'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': 
'1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': 'instance-0000003c-8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-tapaf97aa95-48', 'timestamp': '2025-11-22T07:54:37.299014', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'tapaf97aa95-48', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:6f:0d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf97aa95-48'}, 'message_id': '7e08d59e-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.590264862, 'message_signature': 'be4827fbff03b5c9e02de45ea0d7aef1464e573e44632fd621093ac505b3f943'}]}, 'timestamp': '2025-11-22 07:54:37.299799', '_unique_id': '6b451d407ed74497b33798a67bb1021d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.300 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.302 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.302 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.302 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.303 12 DEBUG ceilometer.compute.pollsters [-] 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a904784-7ea7-4830-93a1-25132dc4eda8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.302082', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e093d22-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.546764864, 'message_signature': '62a0febe536d247c98766b9a1a648f1d77b1e1c9daa54793ea6eadfe56666905'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.302082', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e094ad8-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.546764864, 'message_signature': 'e5a59f31a29bcce46eca82da163f1757596e5874b56b7b5caa59779dfd6c8b75'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-vda', 'timestamp': '2025-11-22T07:54:37.302082', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'instance-0000003c', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e095730-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.563664789, 'message_signature': '4886207f9fc5a2bc0d840ef322c40bc49d7930b6dd120fd95fa109908ad7600c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_name': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_name': None, 'resource_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-sda', 'timestamp': '2025-11-22T07:54:37.302082', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1094572399', 'name': 'instance-0000003c', 'instance_id': '8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c', 'instance_type': 'm1.nano', 'host': '72e94b7c6fa486d3350f26ea28e73433a36c61fee78be7ad311765b5', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e096388-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.563664789, 'message_signature': '82c9336c681a19bdf6e3c06e93fe30ac118865bfcf962ecb0a66440947d058be'}]}, 'timestamp': '2025-11-22 07:54:37.303417', '_unique_id': '1110df1da24047d2bb0e8185124209f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.304 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.305 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.305 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.305 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c2b016c4-0e79-4389-ad09-9b9362320ac7: ceilometer.compute.pollsters.NoVolumeException
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.306 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.307 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.307 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.308 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad2459f0-63cc-4122-aab9-e4819fa052ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.307294', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e0a0946-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'baf70516a8aa4cd02971ae2cad7febb1b512bbf61488618fd832d0f76cf172da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 
'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.307294', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e0a18b4-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': '6edac837f3e06fc2f01e09ca1132fb4c74d6141f7fb444af4b286a71170962e0'}]}, 'timestamp': '2025-11-22 07:54:37.308786', '_unique_id': '7c732b13e66a4953a2b5c5b3ea39ef14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.309 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.310 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.310 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.311 12 DEBUG ceilometer.compute.pollsters [-] c2b016c4-0e79-4389-ad09-9b9362320ac7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.312 12 DEBUG ceilometer.compute.pollsters [-] Instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000003c, id=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ba13780-7a35-4028-be0e-cdac8f5fbf4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-vda', 'timestamp': '2025-11-22T07:54:37.310762', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e0a91ea-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'd240a69383dd1c59dde3851455d8479c501ee02b63c40c47638dfd7bf56b6ebc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 
'project_name': None, 'resource_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7-sda', 'timestamp': '2025-11-22T07:54:37.310762', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1555607633', 'name': 'instance-0000003f', 'instance_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'instance_type': 'm1.nano', 'host': '0e7973d552c6d33cfb86ae5a0f2c5ab81586707493d11fd6d3c545b3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '360f90ca-2ddb-4e60-a48e-364e3b48bd96'}, 'image_ref': '360f90ca-2ddb-4e60-a48e-364e3b48bd96', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e0a9e06-c778-11f0-941d-fa163e6775e5', 'monotonic_time': 4817.597731296, 'message_signature': 'e75019d107b423a4524cef7ad7bdd8627d69a4536c969de9d5b49f22b7500935'}]}, 'timestamp': '2025-11-22 07:54:37.312200', '_unique_id': '08a50f81f1144509ad4a0e38504d1fed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:54:37.313 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531888 podman[222682]: 2025-11-22 07:54:37.606013027 +0000 UTC m=+0.303671886 container remove 7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.614 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6894c83f-5f38-48d0-b62c-d67b5e8acfeb]: (4, ('Sat Nov 22 07:54:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd)\n7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd\nSat Nov 22 07:54:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd)\n7931c5dd1413abb123b21e3a3f260e98488875b824b941621fa0ce5657accfbd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.618 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d5375442-8710-42f4-9b6d-a780710237ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.619 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:37 np0005531888 kernel: tapdc6b9ee8-e0: left promiscuous mode
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.622 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.643 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.646 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[21a924dc-5351-4942-9ec0-7738b3423170]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.669 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf8d6a5-1cb5-430b-aafd-33aee77147d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.672 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4237c9db-aed8-4a24-99b7-4bd35a2dce88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.693 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6b232325-86ef-459d-af4f-d91ff7019482]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478852, 'reachable_time': 26694, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222701, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 systemd[1]: run-netns-ovnmeta\x2ddc6b9ee8\x2de824\x2d42ea\x2dbe5e\x2d5b3c4e48e46a.mount: Deactivated successfully.
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.697 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:54:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:37.697 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e4ccff-0f43-4b1b-801f-d9216e047857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.719 186792 INFO nova.virt.libvirt.driver [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.728 186792 INFO nova.virt.libvirt.driver [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance destroyed successfully.#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.729 186792 DEBUG nova.objects.instance [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'numa_topology' on Instance uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.743 186792 DEBUG nova.compute.manager [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.758 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:37 np0005531888 nova_compute[186788]: 2025-11-22 07:54:37.847 186792 DEBUG oslo_concurrency.lockutils [None req-0ce64fb0-20b0-4a33-adb6-14312f3b6933 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.546 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.809 186792 DEBUG nova.compute.manager [req-0dec57f9-e742-438b-ba49-90d57d23989f req-de916cd0-c236-43a2-8f67-d765392d72b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received event network-vif-unplugged-af97aa95-4802-4456-ae07-64ec497d0797 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.810 186792 DEBUG oslo_concurrency.lockutils [req-0dec57f9-e742-438b-ba49-90d57d23989f req-de916cd0-c236-43a2-8f67-d765392d72b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.810 186792 DEBUG oslo_concurrency.lockutils [req-0dec57f9-e742-438b-ba49-90d57d23989f req-de916cd0-c236-43a2-8f67-d765392d72b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.811 186792 DEBUG oslo_concurrency.lockutils [req-0dec57f9-e742-438b-ba49-90d57d23989f req-de916cd0-c236-43a2-8f67-d765392d72b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.811 186792 DEBUG nova.compute.manager [req-0dec57f9-e742-438b-ba49-90d57d23989f req-de916cd0-c236-43a2-8f67-d765392d72b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] No waiting events found dispatching network-vif-unplugged-af97aa95-4802-4456-ae07-64ec497d0797 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531888 nova_compute[186788]: 2025-11-22 07:54:39.811 186792 WARNING nova.compute.manager [req-0dec57f9-e742-438b-ba49-90d57d23989f req-de916cd0-c236-43a2-8f67-d765392d72b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received unexpected event network-vif-unplugged-af97aa95-4802-4456-ae07-64ec497d0797 for instance with vm_state stopped and task_state None.#033[00m
Nov 22 02:54:40 np0005531888 nova_compute[186788]: 2025-11-22 07:54:40.831 186792 DEBUG nova.compute.manager [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:40 np0005531888 nova_compute[186788]: 2025-11-22 07:54:40.963 186792 INFO nova.compute.manager [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] instance snapshotting#033[00m
Nov 22 02:54:40 np0005531888 nova_compute[186788]: 2025-11-22 07:54:40.964 186792 WARNING nova.compute.manager [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Nov 22 02:54:41 np0005531888 nova_compute[186788]: 2025-11-22 07:54:41.547 186792 INFO nova.virt.libvirt.driver [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Beginning cold snapshot process#033[00m
Nov 22 02:54:41 np0005531888 nova_compute[186788]: 2025-11-22 07:54:41.788 186792 DEBUG nova.privsep.utils [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:54:41 np0005531888 nova_compute[186788]: 2025-11-22 07:54:41.789 186792 DEBUG oslo_concurrency.processutils [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk /var/lib/nova/instances/snapshots/tmp5kq58_cf/3e6ea7e6b07b4f4297b4812fbea48e22 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.273 186792 DEBUG nova.compute.manager [req-ca250022-42ca-4f81-8852-f4d92940211b req-d2a8ed19-33b0-4821-b39e-1fbd940f8afe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.274 186792 DEBUG oslo_concurrency.lockutils [req-ca250022-42ca-4f81-8852-f4d92940211b req-d2a8ed19-33b0-4821-b39e-1fbd940f8afe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.274 186792 DEBUG oslo_concurrency.lockutils [req-ca250022-42ca-4f81-8852-f4d92940211b req-d2a8ed19-33b0-4821-b39e-1fbd940f8afe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.274 186792 DEBUG oslo_concurrency.lockutils [req-ca250022-42ca-4f81-8852-f4d92940211b req-d2a8ed19-33b0-4821-b39e-1fbd940f8afe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.274 186792 DEBUG nova.compute.manager [req-ca250022-42ca-4f81-8852-f4d92940211b req-d2a8ed19-33b0-4821-b39e-1fbd940f8afe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] No waiting events found dispatching network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.275 186792 WARNING nova.compute.manager [req-ca250022-42ca-4f81-8852-f4d92940211b req-d2a8ed19-33b0-4821-b39e-1fbd940f8afe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received unexpected event network-vif-plugged-af97aa95-4802-4456-ae07-64ec497d0797 for instance with vm_state stopped and task_state image_pending_upload.#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.529 186792 DEBUG oslo_concurrency.processutils [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c/disk /var/lib/nova/instances/snapshots/tmp5kq58_cf/3e6ea7e6b07b4f4297b4812fbea48e22" returned: 0 in 0.739s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.530 186792 INFO nova.virt.libvirt.driver [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.760 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.945 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid c2b016c4-0e79-4389-ad09-9b9362320ac7 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.970 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.970 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.970 186792 INFO nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] During sync_power_state the instance has a pending task (image_uploading). Skip.#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.970 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.971 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.971 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:42 np0005531888 nova_compute[186788]: 2025-11-22 07:54:42.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:43 np0005531888 nova_compute[186788]: 2025-11-22 07:54:43.001 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:43 np0005531888 podman[222733]: 2025-11-22 07:54:43.674779954 +0000 UTC m=+0.050520712 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:54:43 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:43Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:e2:00 10.100.0.10
Nov 22 02:54:43 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:43Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:e2:00 10.100.0.10
Nov 22 02:54:43 np0005531888 nova_compute[186788]: 2025-11-22 07:54:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:43 np0005531888 nova_compute[186788]: 2025-11-22 07:54:43.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:54:43 np0005531888 nova_compute[186788]: 2025-11-22 07:54:43.990 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:54:44 np0005531888 nova_compute[186788]: 2025-11-22 07:54:44.549 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:44 np0005531888 nova_compute[186788]: 2025-11-22 07:54:44.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:45 np0005531888 nova_compute[186788]: 2025-11-22 07:54:45.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:45 np0005531888 nova_compute[186788]: 2025-11-22 07:54:45.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:46 np0005531888 nova_compute[186788]: 2025-11-22 07:54:46.564 186792 INFO nova.virt.libvirt.driver [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Snapshot image upload complete#033[00m
Nov 22 02:54:46 np0005531888 nova_compute[186788]: 2025-11-22 07:54:46.564 186792 INFO nova.compute.manager [None req-c9a79666-f1e0-4666-a906-30e7708a5608 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Took 5.59 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:54:46 np0005531888 podman[222758]: 2025-11-22 07:54:46.682509539 +0000 UTC m=+0.053165596 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:54:47 np0005531888 nova_compute[186788]: 2025-11-22 07:54:47.762 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.552 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.946 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.947 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.947 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.947 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.947 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.955 186792 INFO nova.compute.manager [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Terminating instance#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.961 186792 DEBUG nova.compute.manager [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.968 186792 INFO nova.virt.libvirt.driver [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Instance destroyed successfully.#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.969 186792 DEBUG nova.objects.instance [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'resources' on Instance uuid 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.992 186792 DEBUG nova.virt.libvirt.vif [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:53:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1094572399',display_name='tempest-ImagesTestJSON-server-1094572399',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1094572399',id=60,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-u3a0bg8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:46Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.993 186792 DEBUG nova.network.os_vif_util [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "af97aa95-4802-4456-ae07-64ec497d0797", "address": "fa:16:3e:a9:6f:0d", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf97aa95-48", "ovs_interfaceid": "af97aa95-4802-4456-ae07-64ec497d0797", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.994 186792 DEBUG nova.network.os_vif_util [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.995 186792 DEBUG os_vif [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.996 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.997 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf97aa95-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:49 np0005531888 nova_compute[186788]: 2025-11-22 07:54:49.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.000 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.003 186792 INFO os_vif [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:6f:0d,bridge_name='br-int',has_traffic_filtering=True,id=af97aa95-4802-4456-ae07-64ec497d0797,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf97aa95-48')#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.004 186792 INFO nova.virt.libvirt.driver [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Deleting instance files /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c_del#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.010 186792 INFO nova.virt.libvirt.driver [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Deletion of /var/lib/nova/instances/8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c_del complete#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.181 186792 INFO nova.compute.manager [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Took 0.22 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.182 186792 DEBUG oslo.service.loopingcall [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.182 186792 DEBUG nova.compute.manager [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:54:50 np0005531888 nova_compute[186788]: 2025-11-22 07:54:50.182 186792 DEBUG nova.network.neutron [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.442 186792 DEBUG nova.network.neutron [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.472 186792 INFO nova.compute.manager [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Took 1.29 seconds to deallocate network for instance.#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.588 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.589 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.654 186792 DEBUG nova.scheduler.client.report [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.680 186792 DEBUG nova.scheduler.client.report [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.681 186792 DEBUG nova.compute.provider_tree [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.697 186792 DEBUG nova.scheduler.client.report [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.750 186792 DEBUG nova.scheduler.client.report [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.815 186792 DEBUG nova.compute.provider_tree [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.837 186792 DEBUG nova.scheduler.client.report [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.886 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:51 np0005531888 nova_compute[186788]: 2025-11-22 07:54:51.934 186792 INFO nova.scheduler.client.report [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Deleted allocations for instance 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.074 186792 DEBUG oslo_concurrency.lockutils [None req-4ccb5e75-5268-48dd-8bee-ed61a45de398 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.099 186792 DEBUG nova.compute.manager [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Received event network-vif-deleted-af97aa95-4802-4456-ae07-64ec497d0797 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.227 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798077.2263935, 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.228 186792 INFO nova.compute.manager [-] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.260 186792 DEBUG nova.compute.manager [None req-e083ff93-ba1d-419d-970c-0d6a1425f5a9 - - - - - -] [instance: 8e2e9e8f-55de-46fe-9a1e-f05de3b23e9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.765 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.994 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:52 np0005531888 nova_compute[186788]: 2025-11-22 07:54:52.995 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:54:53 np0005531888 podman[222777]: 2025-11-22 07:54:53.123731079 +0000 UTC m=+0.070203144 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.127 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.204 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.205 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.271 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.467 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.469 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5521MB free_disk=73.3219985961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.469 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.469 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.577 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance c2b016c4-0e79-4389-ad09-9b9362320ac7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.577 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.577 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.647 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.692 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.729 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:54:53 np0005531888 nova_compute[186788]: 2025-11-22 07:54:53.729 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:54 np0005531888 ovn_controller[95067]: 2025-11-22T07:54:54Z|00165|binding|INFO|Releasing lport 02324e7a-c5bf-443b-a6e3-5a1cdac9fee4 from this chassis (sb_readonly=0)
Nov 22 02:54:54 np0005531888 nova_compute[186788]: 2025-11-22 07:54:54.235 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:55 np0005531888 nova_compute[186788]: 2025-11-22 07:54:54.999 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:55 np0005531888 podman[222805]: 2025-11-22 07:54:55.719625173 +0000 UTC m=+0.083646825 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:54:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:55.884 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:55 np0005531888 nova_compute[186788]: 2025-11-22 07:54:55.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:55.885 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:54:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:54:55.887 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:57 np0005531888 nova_compute[186788]: 2025-11-22 07:54:57.766 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:00 np0005531888 nova_compute[186788]: 2025-11-22 07:55:00.002 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:00 np0005531888 podman[222829]: 2025-11-22 07:55:00.689251538 +0000 UTC m=+0.059880191 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.openshift.expose-services=)
Nov 22 02:55:02 np0005531888 nova_compute[186788]: 2025-11-22 07:55:02.768 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.792 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.793 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.828 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.984 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.985 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.993 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:55:04 np0005531888 nova_compute[186788]: 2025-11-22 07:55:04.994 186792 INFO nova.compute.claims [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.004 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.335 186792 DEBUG nova.compute.provider_tree [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.375 186792 DEBUG nova.scheduler.client.report [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.408 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.409 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.466 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.489 186792 INFO nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.516 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.868 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.869 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.870 186792 INFO nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Creating image(s)#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.870 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "/var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.871 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "/var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.871 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "/var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.885 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.948 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.950 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.951 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:05 np0005531888 nova_compute[186788]: 2025-11-22 07:55:05.963 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.026 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.028 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.062 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.066 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.067 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.129 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.130 186792 DEBUG nova.virt.disk.api [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Checking if we can resize image /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.131 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.193 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.194 186792 DEBUG nova.virt.disk.api [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Cannot resize image /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.195 186792 DEBUG nova.objects.instance [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lazy-loading 'migration_context' on Instance uuid 22d1fb4f-53b9-4f92-b767-ef6cb7630aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.216 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.217 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Ensure instance console log exists: /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.218 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.218 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.218 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.220 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.225 186792 WARNING nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.232 186792 DEBUG nova.virt.libvirt.host [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.234 186792 DEBUG nova.virt.libvirt.host [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.237 186792 DEBUG nova.virt.libvirt.host [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.238 186792 DEBUG nova.virt.libvirt.host [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.239 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.240 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.240 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.240 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.241 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.241 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.241 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.241 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.241 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.242 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.242 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.242 186792 DEBUG nova.virt.hardware [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.246 186792 DEBUG nova.objects.instance [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 22d1fb4f-53b9-4f92-b767-ef6cb7630aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.268 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <uuid>22d1fb4f-53b9-4f92-b767-ef6cb7630aff</uuid>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <name>instance-00000043</name>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersAaction247Test-server-768063749</nova:name>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:55:06</nova:creationTime>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:user uuid="f5e79c63e7d54c98933d852cf5fb3a15">tempest-ServersAaction247Test-1162188201-project-member</nova:user>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:        <nova:project uuid="a541f4bac1b34ce2a71a3debd4c5fe81">tempest-ServersAaction247Test-1162188201</nova:project>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <entry name="serial">22d1fb4f-53b9-4f92-b767-ef6cb7630aff</entry>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <entry name="uuid">22d1fb4f-53b9-4f92-b767-ef6cb7630aff</entry>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.config"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/console.log" append="off"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:55:06 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:55:06 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:55:06 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:55:06 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.326 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.327 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:06 np0005531888 nova_compute[186788]: 2025-11-22 07:55:06.327 186792 INFO nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Using config drive#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.100 186792 INFO nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Creating config drive at /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.config#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.106 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl4ti8tpb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.235 186792 DEBUG oslo_concurrency.processutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl4ti8tpb" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:07 np0005531888 systemd-machined[153106]: New machine qemu-31-instance-00000043.
Nov 22 02:55:07 np0005531888 systemd[1]: Started Virtual Machine qemu-31-instance-00000043.
Nov 22 02:55:07 np0005531888 podman[222878]: 2025-11-22 07:55:07.383098951 +0000 UTC m=+0.077232497 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:55:07 np0005531888 podman[222879]: 2025-11-22 07:55:07.408147336 +0000 UTC m=+0.094912732 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.711 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798107.7107863, 22d1fb4f-53b9-4f92-b767-ef6cb7630aff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.714 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.717 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.718 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.725 186792 INFO nova.virt.libvirt.driver [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Instance spawned successfully.#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.726 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.744 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.752 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.755 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.755 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.756 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.756 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.757 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.757 186792 DEBUG nova.virt.libvirt.driver [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.770 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.787 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.788 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798107.7118237, 22d1fb4f-53b9-4f92-b767-ef6cb7630aff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.788 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] VM Started (Lifecycle Event)#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.809 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.813 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.832 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.846 186792 INFO nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Took 1.98 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.846 186792 DEBUG nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.924 186792 INFO nova.compute.manager [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Took 2.99 seconds to build instance.#033[00m
Nov 22 02:55:07 np0005531888 nova_compute[186788]: 2025-11-22 07:55:07.942 186792 DEBUG oslo_concurrency.lockutils [None req-9e5c9f1c-cfaf-4a4d-88b3-f0de11074800 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:09 np0005531888 nova_compute[186788]: 2025-11-22 07:55:09.854 186792 DEBUG nova.compute.manager [None req-654bacd4-b3c2-4952-a89b-504715d46ede f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:09 np0005531888 nova_compute[186788]: 2025-11-22 07:55:09.980 186792 INFO nova.compute.manager [None req-654bacd4-b3c2-4952-a89b-504715d46ede f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] instance snapshotting#033[00m
Nov 22 02:55:09 np0005531888 nova_compute[186788]: 2025-11-22 07:55:09.982 186792 DEBUG nova.objects.instance [None req-654bacd4-b3c2-4952-a89b-504715d46ede f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lazy-loading 'flavor' on Instance uuid 22d1fb4f-53b9-4f92-b767-ef6cb7630aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.090 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.091 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.091 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.091 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.092 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.099 186792 INFO nova.compute.manager [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Terminating instance#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.104 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "refresh_cache-22d1fb4f-53b9-4f92-b767-ef6cb7630aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.105 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquired lock "refresh_cache-22d1fb4f-53b9-4f92-b767-ef6cb7630aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.105 186792 DEBUG nova.network.neutron [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.366 186792 DEBUG nova.network.neutron [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.391 186792 INFO nova.virt.libvirt.driver [None req-654bacd4-b3c2-4952-a89b-504715d46ede f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Beginning live snapshot process#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.467 186792 DEBUG nova.compute.manager [None req-654bacd4-b3c2-4952-a89b-504715d46ede f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.797 186792 DEBUG nova.network.neutron [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.814 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Releasing lock "refresh_cache-22d1fb4f-53b9-4f92-b767-ef6cb7630aff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:10 np0005531888 nova_compute[186788]: 2025-11-22 07:55:10.815 186792 DEBUG nova.compute.manager [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:55:10 np0005531888 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000043.scope: Deactivated successfully.
Nov 22 02:55:10 np0005531888 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000043.scope: Consumed 3.464s CPU time.
Nov 22 02:55:10 np0005531888 systemd-machined[153106]: Machine qemu-31-instance-00000043 terminated.
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.060 186792 INFO nova.virt.libvirt.driver [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Instance destroyed successfully.#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.061 186792 DEBUG nova.objects.instance [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lazy-loading 'resources' on Instance uuid 22d1fb4f-53b9-4f92-b767-ef6cb7630aff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.074 186792 INFO nova.virt.libvirt.driver [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Deleting instance files /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff_del#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.074 186792 INFO nova.virt.libvirt.driver [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Deletion of /var/lib/nova/instances/22d1fb4f-53b9-4f92-b767-ef6cb7630aff_del complete#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.167 186792 INFO nova.compute.manager [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.167 186792 DEBUG oslo.service.loopingcall [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.168 186792 DEBUG nova.compute.manager [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.168 186792 DEBUG nova.network.neutron [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:55:11 np0005531888 nova_compute[186788]: 2025-11-22 07:55:11.537 186792 DEBUG nova.compute.manager [None req-654bacd4-b3c2-4952-a89b-504715d46ede f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.086 186792 DEBUG nova.network.neutron [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.117 186792 DEBUG nova.network.neutron [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.131 186792 INFO nova.compute.manager [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Took 0.96 seconds to deallocate network for instance.#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.574 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.575 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.698 186792 DEBUG nova.compute.provider_tree [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.773 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.906 186792 DEBUG nova.scheduler.client.report [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:12 np0005531888 nova_compute[186788]: 2025-11-22 07:55:12.939 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:13 np0005531888 nova_compute[186788]: 2025-11-22 07:55:13.015 186792 INFO nova.scheduler.client.report [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Deleted allocations for instance 22d1fb4f-53b9-4f92-b767-ef6cb7630aff#033[00m
Nov 22 02:55:13 np0005531888 nova_compute[186788]: 2025-11-22 07:55:13.100 186792 DEBUG oslo_concurrency.lockutils [None req-d4286d0b-35e1-4979-a962-01a24f565dd3 f5e79c63e7d54c98933d852cf5fb3a15 a541f4bac1b34ce2a71a3debd4c5fe81 - - default default] Lock "22d1fb4f-53b9-4f92-b767-ef6cb7630aff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:14 np0005531888 podman[222949]: 2025-11-22 07:55:14.706823805 +0000 UTC m=+0.063288964 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:55:15 np0005531888 nova_compute[186788]: 2025-11-22 07:55:15.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.084 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.085 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.085 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.085 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.085 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.093 186792 INFO nova.compute.manager [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Terminating instance#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.098 186792 DEBUG nova.compute.manager [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:55:17 np0005531888 kernel: tapc12ff87c-55 (unregistering): left promiscuous mode
Nov 22 02:55:17 np0005531888 NetworkManager[55166]: <info>  [1763798117.1381] device (tapc12ff87c-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:55:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:17Z|00166|binding|INFO|Releasing lport c12ff87c-55f2-4e24-b84d-d105cfce590a from this chassis (sb_readonly=0)
Nov 22 02:55:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:17Z|00167|binding|INFO|Setting lport c12ff87c-55f2-4e24-b84d-d105cfce590a down in Southbound
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:17Z|00168|binding|INFO|Removing iface tapc12ff87c-55 ovn-installed in OVS
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.152 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.171 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Nov 22 02:55:17 np0005531888 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003f.scope: Consumed 17.575s CPU time.
Nov 22 02:55:17 np0005531888 systemd-machined[153106]: Machine qemu-30-instance-0000003f terminated.
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.213 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:e2:00 10.100.0.10'], port_security=['fa:16:3e:78:e2:00 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c2b016c4-0e79-4389-ad09-9b9362320ac7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=c12ff87c-55f2-4e24-b84d-d105cfce590a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.214 104023 INFO neutron.agent.ovn.metadata.agent [-] Port c12ff87c-55f2-4e24-b84d-d105cfce590a in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed unbound from our chassis#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.215 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62930ff4-55a3-4e08-8229-5532aa7dcaed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.217 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5eacaa03-089e-4a6c-be59-b700f25e2987]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.218 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace which is not needed anymore#033[00m
Nov 22 02:55:17 np0005531888 podman[222974]: 2025-11-22 07:55:17.225491014 +0000 UTC m=+0.063907480 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.324 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.328 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [NOTICE]   (222535) : haproxy version is 2.8.14-c23fe91
Nov 22 02:55:17 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [NOTICE]   (222535) : path to executable is /usr/sbin/haproxy
Nov 22 02:55:17 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [WARNING]  (222535) : Exiting Master process...
Nov 22 02:55:17 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [ALERT]    (222535) : Current worker (222538) exited with code 143 (Terminated)
Nov 22 02:55:17 np0005531888 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[222526]: [WARNING]  (222535) : All workers exited. Exiting... (0)
Nov 22 02:55:17 np0005531888 systemd[1]: libpod-5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac.scope: Deactivated successfully.
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.374 186792 INFO nova.virt.libvirt.driver [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Instance destroyed successfully.#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.375 186792 DEBUG nova.objects.instance [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'resources' on Instance uuid c2b016c4-0e79-4389-ad09-9b9362320ac7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:17 np0005531888 podman[223018]: 2025-11-22 07:55:17.377456634 +0000 UTC m=+0.073130726 container died 5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.399 186792 DEBUG nova.virt.libvirt.vif [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1555607633',display_name='tempest-ListServerFiltersTestJSON-instance-1555607633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1555607633',id=63,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ihsevoj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:27Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=c2b016c4-0e79-4389-ad09-9b9362320ac7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.399 186792 DEBUG nova.network.os_vif_util [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "address": "fa:16:3e:78:e2:00", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc12ff87c-55", "ovs_interfaceid": "c12ff87c-55f2-4e24-b84d-d105cfce590a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.400 186792 DEBUG nova.network.os_vif_util [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.400 186792 DEBUG os_vif [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.403 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc12ff87c-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.404 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.409 186792 INFO os_vif [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=c12ff87c-55f2-4e24-b84d-d105cfce590a,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc12ff87c-55')#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.410 186792 INFO nova.virt.libvirt.driver [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Deleting instance files /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7_del#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.411 186792 INFO nova.virt.libvirt.driver [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Deletion of /var/lib/nova/instances/c2b016c4-0e79-4389-ad09-9b9362320ac7_del complete#033[00m
Nov 22 02:55:17 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac-userdata-shm.mount: Deactivated successfully.
Nov 22 02:55:17 np0005531888 systemd[1]: var-lib-containers-storage-overlay-98ce53faa472ff6273b1e8c83d059e3767a94fa7fc69b4d4c19cb618b7d5aada-merged.mount: Deactivated successfully.
Nov 22 02:55:17 np0005531888 podman[223018]: 2025-11-22 07:55:17.459205931 +0000 UTC m=+0.154880023 container cleanup 5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:55:17 np0005531888 systemd[1]: libpod-conmon-5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac.scope: Deactivated successfully.
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.501 186792 INFO nova.compute.manager [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.502 186792 DEBUG oslo.service.loopingcall [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.502 186792 DEBUG nova.compute.manager [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.502 186792 DEBUG nova.network.neutron [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:55:17 np0005531888 podman[223060]: 2025-11-22 07:55:17.703872198 +0000 UTC m=+0.223703613 container remove 5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.710 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[58736d95-5691-4917-9458-6cbbcbf2cb9f]: (4, ('Sat Nov 22 07:55:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac)\n5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac\nSat Nov 22 07:55:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac)\n5c9520b19b0959ad9e54771123b09320311d18faaa3b1fd8dd03affcc9f5e4ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.712 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d817e1f1-1e43-4cfc-be57-271102da5c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.713 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.716 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 kernel: tap62930ff4-50: left promiscuous mode
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.728 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.732 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8577f6-7b81-4e53-9488-e0cb3cf2adca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.749 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f9502769-705c-42f5-9856-7da4ff324fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.750 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[733762a9-c025-4b41-a1d2-e2d4df71a7c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.767 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5570ef9-7cc5-439c-aa79-e3154d55be4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480701, 'reachable_time': 40504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223075, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.771 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:55:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:17.771 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[5f845f0b-a1cd-47d6-8fd3-2d94bc5728f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:17 np0005531888 systemd[1]: run-netns-ovnmeta\x2d62930ff4\x2d55a3\x2d4e08\x2d8229\x2d5532aa7dcaed.mount: Deactivated successfully.
Nov 22 02:55:17 np0005531888 nova_compute[186788]: 2025-11-22 07:55:17.775 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.161 186792 DEBUG nova.compute.manager [req-5f1148bb-6ebe-4f9c-ac0b-71a1e27afe6e req-cde1327c-af28-4b97-ae6e-0877c782b340 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-vif-unplugged-c12ff87c-55f2-4e24-b84d-d105cfce590a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.162 186792 DEBUG oslo_concurrency.lockutils [req-5f1148bb-6ebe-4f9c-ac0b-71a1e27afe6e req-cde1327c-af28-4b97-ae6e-0877c782b340 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.162 186792 DEBUG oslo_concurrency.lockutils [req-5f1148bb-6ebe-4f9c-ac0b-71a1e27afe6e req-cde1327c-af28-4b97-ae6e-0877c782b340 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.162 186792 DEBUG oslo_concurrency.lockutils [req-5f1148bb-6ebe-4f9c-ac0b-71a1e27afe6e req-cde1327c-af28-4b97-ae6e-0877c782b340 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.162 186792 DEBUG nova.compute.manager [req-5f1148bb-6ebe-4f9c-ac0b-71a1e27afe6e req-cde1327c-af28-4b97-ae6e-0877c782b340 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] No waiting events found dispatching network-vif-unplugged-c12ff87c-55f2-4e24-b84d-d105cfce590a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.163 186792 DEBUG nova.compute.manager [req-5f1148bb-6ebe-4f9c-ac0b-71a1e27afe6e req-cde1327c-af28-4b97-ae6e-0877c782b340 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-vif-unplugged-c12ff87c-55f2-4e24-b84d-d105cfce590a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.735 186792 DEBUG nova.network.neutron [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.767 186792 INFO nova.compute.manager [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Took 1.26 seconds to deallocate network for instance.#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.840 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.841 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.856 186792 DEBUG nova.compute.manager [req-45046f99-8719-4c8e-bb25-62adf1ef4e74 req-bfb70d01-0901-41ab-ae9a-5027ce796061 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-vif-deleted-c12ff87c-55f2-4e24-b84d-d105cfce590a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.926 186792 DEBUG nova.compute.provider_tree [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.940 186792 DEBUG nova.scheduler.client.report [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:18 np0005531888 nova_compute[186788]: 2025-11-22 07:55:18.964 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:19 np0005531888 nova_compute[186788]: 2025-11-22 07:55:19.006 186792 INFO nova.scheduler.client.report [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Deleted allocations for instance c2b016c4-0e79-4389-ad09-9b9362320ac7#033[00m
Nov 22 02:55:19 np0005531888 nova_compute[186788]: 2025-11-22 07:55:19.089 186792 DEBUG oslo_concurrency.lockutils [None req-09655d2d-b734-4b10-8927-d5b584a642ac 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:20 np0005531888 nova_compute[186788]: 2025-11-22 07:55:20.315 186792 DEBUG nova.compute.manager [req-5ea4170f-c1cc-46b6-be0d-e9fff6aa361e req-a3ded92a-3e7e-40c9-8567-9807ea6659ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:20 np0005531888 nova_compute[186788]: 2025-11-22 07:55:20.316 186792 DEBUG oslo_concurrency.lockutils [req-5ea4170f-c1cc-46b6-be0d-e9fff6aa361e req-a3ded92a-3e7e-40c9-8567-9807ea6659ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:20 np0005531888 nova_compute[186788]: 2025-11-22 07:55:20.316 186792 DEBUG oslo_concurrency.lockutils [req-5ea4170f-c1cc-46b6-be0d-e9fff6aa361e req-a3ded92a-3e7e-40c9-8567-9807ea6659ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:20 np0005531888 nova_compute[186788]: 2025-11-22 07:55:20.316 186792 DEBUG oslo_concurrency.lockutils [req-5ea4170f-c1cc-46b6-be0d-e9fff6aa361e req-a3ded92a-3e7e-40c9-8567-9807ea6659ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2b016c4-0e79-4389-ad09-9b9362320ac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:20 np0005531888 nova_compute[186788]: 2025-11-22 07:55:20.316 186792 DEBUG nova.compute.manager [req-5ea4170f-c1cc-46b6-be0d-e9fff6aa361e req-a3ded92a-3e7e-40c9-8567-9807ea6659ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] No waiting events found dispatching network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:20 np0005531888 nova_compute[186788]: 2025-11-22 07:55:20.317 186792 WARNING nova.compute.manager [req-5ea4170f-c1cc-46b6-be0d-e9fff6aa361e req-a3ded92a-3e7e-40c9-8567-9807ea6659ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Received unexpected event network-vif-plugged-c12ff87c-55f2-4e24-b84d-d105cfce590a for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:55:22 np0005531888 nova_compute[186788]: 2025-11-22 07:55:22.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:22 np0005531888 nova_compute[186788]: 2025-11-22 07:55:22.778 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.701 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.701 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:23 np0005531888 podman[223076]: 2025-11-22 07:55:23.707748812 +0000 UTC m=+0.080731933 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.719 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.813 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.813 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.821 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.821 186792 INFO nova.compute.claims [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.968 186792 DEBUG nova.compute.provider_tree [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:23 np0005531888 nova_compute[186788]: 2025-11-22 07:55:23.991 186792 DEBUG nova.scheduler.client.report [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.019 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.020 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.081 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.082 186792 DEBUG nova.network.neutron [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.110 186792 INFO nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.129 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.262 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.265 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.266 186792 INFO nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Creating image(s)#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.267 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "/var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.267 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.268 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.284 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.369 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.370 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.371 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.385 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.410 186792 DEBUG nova.policy [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.450 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.451 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.499 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.501 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.502 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.559 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.563 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.564 186792 DEBUG nova.virt.disk.api [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Checking if we can resize image /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.564 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.622 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.623 186792 DEBUG nova.virt.disk.api [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Cannot resize image /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.623 186792 DEBUG nova.objects.instance [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'migration_context' on Instance uuid dd6876d4-cab4-413b-9d67-10e2ba45a220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.651 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.651 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Ensure instance console log exists: /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.652 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.652 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:24 np0005531888 nova_compute[186788]: 2025-11-22 07:55:24.652 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:25 np0005531888 nova_compute[186788]: 2025-11-22 07:55:25.342 186792 DEBUG nova.network.neutron [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Successfully created port: 82dff43c-f553-4814-a2ec-d51fb34dd31e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:55:26 np0005531888 nova_compute[186788]: 2025-11-22 07:55:26.060 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798111.057709, 22d1fb4f-53b9-4f92-b767-ef6cb7630aff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:26 np0005531888 nova_compute[186788]: 2025-11-22 07:55:26.061 186792 INFO nova.compute.manager [-] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:55:26 np0005531888 nova_compute[186788]: 2025-11-22 07:55:26.104 186792 DEBUG nova.compute.manager [None req-693cb922-6eed-41ab-89df-52ab64fb484f - - - - - -] [instance: 22d1fb4f-53b9-4f92-b767-ef6cb7630aff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:26 np0005531888 podman[223112]: 2025-11-22 07:55:26.682542487 +0000 UTC m=+0.060354932 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.074 186792 DEBUG nova.network.neutron [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Successfully updated port: 82dff43c-f553-4814-a2ec-d51fb34dd31e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.102 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.103 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquired lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.103 186792 DEBUG nova.network.neutron [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.216 186792 DEBUG nova.compute.manager [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-changed-82dff43c-f553-4814-a2ec-d51fb34dd31e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.217 186792 DEBUG nova.compute.manager [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Refreshing instance network info cache due to event network-changed-82dff43c-f553-4814-a2ec-d51fb34dd31e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.217 186792 DEBUG oslo_concurrency.lockutils [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.657 186792 DEBUG nova.network.neutron [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:55:27 np0005531888 nova_compute[186788]: 2025-11-22 07:55:27.780 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.167 186792 DEBUG nova.network.neutron [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Updating instance_info_cache with network_info: [{"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.194 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Releasing lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.195 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Instance network_info: |[{"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.196 186792 DEBUG oslo_concurrency.lockutils [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.197 186792 DEBUG nova.network.neutron [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Refreshing network info cache for port 82dff43c-f553-4814-a2ec-d51fb34dd31e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.202 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Start _get_guest_xml network_info=[{"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.210 186792 WARNING nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.218 186792 DEBUG nova.virt.libvirt.host [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.219 186792 DEBUG nova.virt.libvirt.host [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.233 186792 DEBUG nova.virt.libvirt.host [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.234 186792 DEBUG nova.virt.libvirt.host [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.235 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.235 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.236 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.236 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.236 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.236 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.237 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.237 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.237 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.237 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.238 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.238 186792 DEBUG nova.virt.hardware [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.241 186792 DEBUG nova.virt.libvirt.vif [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1813873897',display_name='tempest-ImagesTestJSON-server-1813873897',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1813873897',id=68,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-jizt31sp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:24Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=dd6876d4-cab4-413b-9d67-10e2ba45a220,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.242 186792 DEBUG nova.network.os_vif_util [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.242 186792 DEBUG nova.network.os_vif_util [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.243 186792 DEBUG nova.objects.instance [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid dd6876d4-cab4-413b-9d67-10e2ba45a220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.271 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <uuid>dd6876d4-cab4-413b-9d67-10e2ba45a220</uuid>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <name>instance-00000044</name>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:name>tempest-ImagesTestJSON-server-1813873897</nova:name>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:55:29</nova:creationTime>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:user uuid="1ac2d2381d294c96aff369941185056a">tempest-ImagesTestJSON-117614339-project-member</nova:user>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:project uuid="7ec4007dc8214caab4e2eb40f11fb3cd">tempest-ImagesTestJSON-117614339</nova:project>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        <nova:port uuid="82dff43c-f553-4814-a2ec-d51fb34dd31e">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <entry name="serial">dd6876d4-cab4-413b-9d67-10e2ba45a220</entry>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <entry name="uuid">dd6876d4-cab4-413b-9d67-10e2ba45a220</entry>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.config"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:e2:0d:90"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <target dev="tap82dff43c-f5"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/console.log" append="off"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:55:29 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:55:29 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:55:29 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:55:29 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.273 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Preparing to wait for external event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.273 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.273 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.274 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.274 186792 DEBUG nova.virt.libvirt.vif [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1813873897',display_name='tempest-ImagesTestJSON-server-1813873897',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1813873897',id=68,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-jizt31sp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:24Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=dd6876d4-cab4-413b-9d67-10e2ba45a220,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.275 186792 DEBUG nova.network.os_vif_util [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.275 186792 DEBUG nova.network.os_vif_util [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.275 186792 DEBUG os_vif [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.276 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.277 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.281 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.281 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82dff43c-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.281 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82dff43c-f5, col_values=(('external_ids', {'iface-id': '82dff43c-f553-4814-a2ec-d51fb34dd31e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:0d:90', 'vm-uuid': 'dd6876d4-cab4-413b-9d67-10e2ba45a220'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:29 np0005531888 NetworkManager[55166]: <info>  [1763798129.2842] manager: (tap82dff43c-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.285 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.290 186792 INFO os_vif [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5')#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.352 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.352 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.352 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No VIF found with MAC fa:16:3e:e2:0d:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:55:29 np0005531888 nova_compute[186788]: 2025-11-22 07:55:29.353 186792 INFO nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Using config drive#033[00m
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.121 186792 INFO nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Creating config drive at /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.config#033[00m
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.127 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx4pzwn27 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.253 186792 DEBUG oslo_concurrency.processutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx4pzwn27" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:30 np0005531888 kernel: tap82dff43c-f5: entered promiscuous mode
Nov 22 02:55:30 np0005531888 NetworkManager[55166]: <info>  [1763798130.3264] manager: (tap82dff43c-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Nov 22 02:55:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:30Z|00169|binding|INFO|Claiming lport 82dff43c-f553-4814-a2ec-d51fb34dd31e for this chassis.
Nov 22 02:55:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:30Z|00170|binding|INFO|82dff43c-f553-4814-a2ec-d51fb34dd31e: Claiming fa:16:3e:e2:0d:90 10.100.0.13
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.326 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.330 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.340 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:0d:90 10.100.0.13'], port_security=['fa:16:3e:e2:0d:90 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'dd6876d4-cab4-413b-9d67-10e2ba45a220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=82dff43c-f553-4814-a2ec-d51fb34dd31e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.341 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 82dff43c-f553-4814-a2ec-d51fb34dd31e in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a bound to our chassis#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.342 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.353 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[640e3f6b-09c5-49bd-80b8-1140d81ba6c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.354 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc6b9ee8-e1 in ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.356 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc6b9ee8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.356 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[264aa331-3d68-4ff2-9668-9c8de9651137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.357 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a8d7e7-271a-443e-9f5c-7cbdf25bf365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 systemd-udevd[223158]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.369 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0015e5eb-0d5e-4919-9172-ee67a67b0275]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 NetworkManager[55166]: <info>  [1763798130.3727] device (tap82dff43c-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:55:30 np0005531888 NetworkManager[55166]: <info>  [1763798130.3739] device (tap82dff43c-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:55:30 np0005531888 systemd-machined[153106]: New machine qemu-32-instance-00000044.
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.385 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:30Z|00171|binding|INFO|Setting lport 82dff43c-f553-4814-a2ec-d51fb34dd31e ovn-installed in OVS
Nov 22 02:55:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:30Z|00172|binding|INFO|Setting lport 82dff43c-f553-4814-a2ec-d51fb34dd31e up in Southbound
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.392 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.393 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[115395ca-5c32-4f43-ae92-a4de9ed52a4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 systemd[1]: Started Virtual Machine qemu-32-instance-00000044.
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.422 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[32656c66-3a48-4c7a-9e3c-5b782d3b262a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 systemd-udevd[223163]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:55:30 np0005531888 NetworkManager[55166]: <info>  [1763798130.4286] manager: (tapdc6b9ee8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.428 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bcaa0e-7b58-4cc7-95bc-7a75c8b004ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.461 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d09856a6-0c60-4cdf-b324-2ca023b58745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.466 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a59a0e-a99d-4190-9722-ad51e763d485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 NetworkManager[55166]: <info>  [1763798130.4899] device (tapdc6b9ee8-e0): carrier: link connected
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.495 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb4640a-59c6-4513-b9fb-19dd4aa0906c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.511 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c553f95e-80df-4ad3-9b02-ec0c5dcae9e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487113, 'reachable_time': 21697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223192, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.526 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[de56725c-9ad3-486d-b5d0-b98bc880046a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d89c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487113, 'tstamp': 487113}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223193, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.544 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b2994b19-f4a2-4aa1-af53-decd291caf7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487113, 'reachable_time': 21697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223194, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.574 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b543de03-a927-417d-89a2-412b8424dae2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.633 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae88726-bf0f-4dbd-91e9-e352fcfbb9e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.634 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.635 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.635 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc6b9ee8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.637 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:55:30 np0005531888 kernel: tapdc6b9ee8-e0: entered promiscuous mode
Nov 22 02:55:30 np0005531888 NetworkManager[55166]: <info>  [1763798130.6397] manager: (tapdc6b9ee8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.639 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.640 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc6b9ee8-e0, col_values=(('external_ids', {'iface-id': '99cae854-daa9-4d08-8152-257a15e21bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.642 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:55:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:30Z|00173|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.644 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.643 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.644 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be7f2e5c-0de8-4ca0-8dd2-a44984e144ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.645 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 02:55:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:30.646 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'env', 'PROCESS_TAG=haproxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.654 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.667 186792 DEBUG nova.compute.manager [req-61bafb90-365b-4d53-b461-77ac42cf7c52 req-71df9895-3b87-4e44-a462-e7d535d8e7fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.668 186792 DEBUG oslo_concurrency.lockutils [req-61bafb90-365b-4d53-b461-77ac42cf7c52 req-71df9895-3b87-4e44-a462-e7d535d8e7fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.668 186792 DEBUG oslo_concurrency.lockutils [req-61bafb90-365b-4d53-b461-77ac42cf7c52 req-71df9895-3b87-4e44-a462-e7d535d8e7fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.669 186792 DEBUG oslo_concurrency.lockutils [req-61bafb90-365b-4d53-b461-77ac42cf7c52 req-71df9895-3b87-4e44-a462-e7d535d8e7fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.669 186792 DEBUG nova.compute.manager [req-61bafb90-365b-4d53-b461-77ac42cf7c52 req-71df9895-3b87-4e44-a462-e7d535d8e7fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Processing event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.939 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798130.9389956, dd6876d4-cab4-413b-9d67-10e2ba45a220 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.940 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] VM Started (Lifecycle Event)
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.943 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.947 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.951 186792 INFO nova.virt.libvirt.driver [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Instance spawned successfully.
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.952 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.968 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.972 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.995 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.996 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.996 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.997 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.998 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:55:30 np0005531888 nova_compute[186788]: 2025-11-22 07:55:30.998 186792 DEBUG nova.virt.libvirt.driver [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.002 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.003 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798130.9402285, dd6876d4-cab4-413b-9d67-10e2ba45a220 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.003 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] VM Paused (Lifecycle Event)
Nov 22 02:55:31 np0005531888 podman[223233]: 2025-11-22 07:55:31.039124774 +0000 UTC m=+0.070349568 container create 394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.041 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.047 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798130.9463317, dd6876d4-cab4-413b-9d67-10e2ba45a220 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.048 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] VM Resumed (Lifecycle Event)
Nov 22 02:55:31 np0005531888 podman[223233]: 2025-11-22 07:55:30.996734193 +0000 UTC m=+0.027959007 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.093 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:55:31 np0005531888 systemd[1]: Started libpod-conmon-394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868.scope.
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.102 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.106 186792 INFO nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Took 6.84 seconds to spawn the instance on the hypervisor.
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.107 186792 DEBUG nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:55:31 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:55:31 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6c55836c742788982345cd9ab17a78f8cd3425cf50f7e516d33f6b15ea8f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.136 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 02:55:31 np0005531888 podman[223246]: 2025-11-22 07:55:31.137591421 +0000 UTC m=+0.064586257 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Nov 22 02:55:31 np0005531888 podman[223233]: 2025-11-22 07:55:31.153957393 +0000 UTC m=+0.185182207 container init 394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:55:31 np0005531888 podman[223233]: 2025-11-22 07:55:31.159678223 +0000 UTC m=+0.190903017 container start 394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:55:31 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [NOTICE]   (223273) : New worker (223275) forked
Nov 22 02:55:31 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [NOTICE]   (223273) : Loading success.
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.232 186792 INFO nova.compute.manager [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Took 7.45 seconds to build instance.#033[00m
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.260 186792 DEBUG oslo_concurrency.lockutils [None req-523ef3ae-b63f-49bd-9d75-1b36b0cc0b49 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.688 186792 DEBUG nova.network.neutron [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Updated VIF entry in instance network info cache for port 82dff43c-f553-4814-a2ec-d51fb34dd31e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.690 186792 DEBUG nova.network.neutron [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Updating instance_info_cache with network_info: [{"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:31 np0005531888 nova_compute[186788]: 2025-11-22 07:55:31.719 186792 DEBUG oslo_concurrency.lockutils [req-61c11bd8-8525-4a2e-814e-5fbad8cdf91c req-9262890d-8ba2-4bde-8586-6dfd5eb8e609 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.374 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798117.3724475, c2b016c4-0e79-4389-ad09-9b9362320ac7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.376 186792 INFO nova.compute.manager [-] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.404 186792 DEBUG nova.compute.manager [None req-7817531e-6df6-4df5-a10f-2be62b5773cc - - - - - -] [instance: c2b016c4-0e79-4389-ad09-9b9362320ac7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.781 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.789 186792 DEBUG nova.compute.manager [req-974c7089-ed59-4cbd-8403-6c8b656487bc req-dfe6ef19-87e4-4ec9-b93b-0e52e3514629 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.789 186792 DEBUG oslo_concurrency.lockutils [req-974c7089-ed59-4cbd-8403-6c8b656487bc req-dfe6ef19-87e4-4ec9-b93b-0e52e3514629 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.789 186792 DEBUG oslo_concurrency.lockutils [req-974c7089-ed59-4cbd-8403-6c8b656487bc req-dfe6ef19-87e4-4ec9-b93b-0e52e3514629 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.790 186792 DEBUG oslo_concurrency.lockutils [req-974c7089-ed59-4cbd-8403-6c8b656487bc req-dfe6ef19-87e4-4ec9-b93b-0e52e3514629 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.790 186792 DEBUG nova.compute.manager [req-974c7089-ed59-4cbd-8403-6c8b656487bc req-dfe6ef19-87e4-4ec9-b93b-0e52e3514629 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] No waiting events found dispatching network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:32 np0005531888 nova_compute[186788]: 2025-11-22 07:55:32.790 186792 WARNING nova.compute.manager [req-974c7089-ed59-4cbd-8403-6c8b656487bc req-dfe6ef19-87e4-4ec9-b93b-0e52e3514629 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received unexpected event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e for instance with vm_state active and task_state None.#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.080 186792 DEBUG nova.compute.manager [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.172 186792 INFO nova.compute.manager [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] instance snapshotting#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.284 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.506 186792 INFO nova.virt.libvirt.driver [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Beginning live snapshot process#033[00m
Nov 22 02:55:34 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.800 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.862 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.863 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.921 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.935 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.997 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:34 np0005531888 nova_compute[186788]: 2025-11-22 07:55:34.998 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp_n6vr6xd/136bd326c5d346819d4a0457e54b2d96.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.036 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp_n6vr6xd/136bd326c5d346819d4a0457e54b2d96.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.038 186792 INFO nova.virt.libvirt.driver [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.100 186792 DEBUG nova.virt.libvirt.guest [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.105 186792 INFO nova.virt.libvirt.driver [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.154 186792 DEBUG nova.privsep.utils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.155 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp_n6vr6xd/136bd326c5d346819d4a0457e54b2d96.delta /var/lib/nova/instances/snapshots/tmp_n6vr6xd/136bd326c5d346819d4a0457e54b2d96 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.370 186792 DEBUG oslo_concurrency.processutils [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp_n6vr6xd/136bd326c5d346819d4a0457e54b2d96.delta /var/lib/nova/instances/snapshots/tmp_n6vr6xd/136bd326c5d346819d4a0457e54b2d96" returned: 0 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:35 np0005531888 nova_compute[186788]: 2025-11-22 07:55:35.372 186792 INFO nova.virt.libvirt.driver [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:55:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:36.808 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:36.809 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:55:36.810 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:37 np0005531888 podman[223311]: 2025-11-22 07:55:37.703915262 +0000 UTC m=+0.073411384 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:55:37 np0005531888 podman[223312]: 2025-11-22 07:55:37.721289628 +0000 UTC m=+0.090182214 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 22 02:55:37 np0005531888 nova_compute[186788]: 2025-11-22 07:55:37.783 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:39 np0005531888 nova_compute[186788]: 2025-11-22 07:55:39.287 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:40 np0005531888 nova_compute[186788]: 2025-11-22 07:55:40.471 186792 INFO nova.virt.libvirt.driver [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Snapshot image upload complete#033[00m
Nov 22 02:55:40 np0005531888 nova_compute[186788]: 2025-11-22 07:55:40.471 186792 INFO nova.compute.manager [None req-fb25adb7-2185-47b8-ba6e-fb91984d1208 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Took 6.29 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:55:42 np0005531888 nova_compute[186788]: 2025-11-22 07:55:42.786 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:44 np0005531888 nova_compute[186788]: 2025-11-22 07:55:44.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:44 np0005531888 nova_compute[186788]: 2025-11-22 07:55:44.723 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:44 np0005531888 nova_compute[186788]: 2025-11-22 07:55:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:44 np0005531888 nova_compute[186788]: 2025-11-22 07:55:44.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:55:44 np0005531888 nova_compute[186788]: 2025-11-22 07:55:44.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:55:45 np0005531888 nova_compute[186788]: 2025-11-22 07:55:45.196 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:45 np0005531888 nova_compute[186788]: 2025-11-22 07:55:45.196 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:45 np0005531888 nova_compute[186788]: 2025-11-22 07:55:45.196 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:55:45 np0005531888 nova_compute[186788]: 2025-11-22 07:55:45.197 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dd6876d4-cab4-413b-9d67-10e2ba45a220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:45 np0005531888 podman[223369]: 2025-11-22 07:55:45.68776339 +0000 UTC m=+0.054118069 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:55:46 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:46Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:0d:90 10.100.0.13
Nov 22 02:55:46 np0005531888 ovn_controller[95067]: 2025-11-22T07:55:46Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:0d:90 10.100.0.13
Nov 22 02:55:47 np0005531888 podman[223398]: 2025-11-22 07:55:47.676694256 +0000 UTC m=+0.050832380 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 02:55:47 np0005531888 nova_compute[186788]: 2025-11-22 07:55:47.787 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.007 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Updating instance_info_cache with network_info: [{"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.031 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-dd6876d4-cab4-413b-9d67-10e2ba45a220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.032 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.032 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.032 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.033 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:48 np0005531888 nova_compute[186788]: 2025-11-22 07:55:48.033 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:49 np0005531888 nova_compute[186788]: 2025-11-22 07:55:49.291 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:52 np0005531888 nova_compute[186788]: 2025-11-22 07:55:52.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:54 np0005531888 podman[223417]: 2025-11-22 07:55:54.697455341 +0000 UTC m=+0.068856781 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:54 np0005531888 nova_compute[186788]: 2025-11-22 07:55:54.996 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.609 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.675 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.676 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.747 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.979 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5566MB free_disk=73.32127380371094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:55 np0005531888 nova_compute[186788]: 2025-11-22 07:55:55.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.037 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance dd6876d4-cab4-413b-9d67-10e2ba45a220 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.038 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.038 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.078 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.112 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.151 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:55:56 np0005531888 nova_compute[186788]: 2025-11-22 07:55:56.151 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:57 np0005531888 podman[223443]: 2025-11-22 07:55:57.683936744 +0000 UTC m=+0.056351504 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:55:57 np0005531888 nova_compute[186788]: 2025-11-22 07:55:57.793 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:58 np0005531888 nova_compute[186788]: 2025-11-22 07:55:58.147 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:59 np0005531888 nova_compute[186788]: 2025-11-22 07:55:59.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:01 np0005531888 podman[223467]: 2025-11-22 07:56:01.6938503 +0000 UTC m=+0.060441695 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.069 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.070 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.070 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.071 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.071 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.080 186792 INFO nova.compute.manager [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Terminating instance#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.085 186792 DEBUG nova.compute.manager [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:56:02 np0005531888 kernel: tap82dff43c-f5 (unregistering): left promiscuous mode
Nov 22 02:56:02 np0005531888 NetworkManager[55166]: <info>  [1763798162.1113] device (tap82dff43c-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.123 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:02Z|00174|binding|INFO|Releasing lport 82dff43c-f553-4814-a2ec-d51fb34dd31e from this chassis (sb_readonly=0)
Nov 22 02:56:02 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:02Z|00175|binding|INFO|Setting lport 82dff43c-f553-4814-a2ec-d51fb34dd31e down in Southbound
Nov 22 02:56:02 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:02Z|00176|binding|INFO|Removing iface tap82dff43c-f5 ovn-installed in OVS
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.137 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:0d:90 10.100.0.13'], port_security=['fa:16:3e:e2:0d:90 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'dd6876d4-cab4-413b-9d67-10e2ba45a220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=82dff43c-f553-4814-a2ec-d51fb34dd31e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.140 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 82dff43c-f553-4814-a2ec-d51fb34dd31e in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a unbound from our chassis#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.142 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.144 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[329e075e-c095-4db0-b4a9-eb477241bec0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.145 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace which is not needed anymore#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.155 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000044.scope: Deactivated successfully.
Nov 22 02:56:02 np0005531888 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000044.scope: Consumed 15.624s CPU time.
Nov 22 02:56:02 np0005531888 systemd-machined[153106]: Machine qemu-32-instance-00000044 terminated.
Nov 22 02:56:02 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [NOTICE]   (223273) : haproxy version is 2.8.14-c23fe91
Nov 22 02:56:02 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [NOTICE]   (223273) : path to executable is /usr/sbin/haproxy
Nov 22 02:56:02 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [WARNING]  (223273) : Exiting Master process...
Nov 22 02:56:02 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [WARNING]  (223273) : Exiting Master process...
Nov 22 02:56:02 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [ALERT]    (223273) : Current worker (223275) exited with code 143 (Terminated)
Nov 22 02:56:02 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223261]: [WARNING]  (223273) : All workers exited. Exiting... (0)
Nov 22 02:56:02 np0005531888 systemd[1]: libpod-394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868.scope: Deactivated successfully.
Nov 22 02:56:02 np0005531888 conmon[223261]: conmon 394a115057593637ae08 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868.scope/container/memory.events
Nov 22 02:56:02 np0005531888 podman[223514]: 2025-11-22 07:56:02.316360181 +0000 UTC m=+0.066997725 container died 394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:56:02 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868-userdata-shm.mount: Deactivated successfully.
Nov 22 02:56:02 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e4f6c55836c742788982345cd9ab17a78f8cd3425cf50f7e516d33f6b15ea8f4-merged.mount: Deactivated successfully.
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.369 186792 INFO nova.virt.libvirt.driver [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Instance destroyed successfully.#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.372 186792 DEBUG nova.objects.instance [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'resources' on Instance uuid dd6876d4-cab4-413b-9d67-10e2ba45a220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:02 np0005531888 podman[223514]: 2025-11-22 07:56:02.376897887 +0000 UTC m=+0.127535431 container cleanup 394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:56:02 np0005531888 systemd[1]: libpod-conmon-394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868.scope: Deactivated successfully.
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.383 186792 DEBUG nova.compute.manager [req-edcd933e-361c-4eb6-959a-18cd56680ccf req-1f481700-477e-43a9-98fd-ae4c50440936 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-vif-unplugged-82dff43c-f553-4814-a2ec-d51fb34dd31e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.384 186792 DEBUG oslo_concurrency.lockutils [req-edcd933e-361c-4eb6-959a-18cd56680ccf req-1f481700-477e-43a9-98fd-ae4c50440936 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.384 186792 DEBUG oslo_concurrency.lockutils [req-edcd933e-361c-4eb6-959a-18cd56680ccf req-1f481700-477e-43a9-98fd-ae4c50440936 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.385 186792 DEBUG oslo_concurrency.lockutils [req-edcd933e-361c-4eb6-959a-18cd56680ccf req-1f481700-477e-43a9-98fd-ae4c50440936 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.385 186792 DEBUG nova.compute.manager [req-edcd933e-361c-4eb6-959a-18cd56680ccf req-1f481700-477e-43a9-98fd-ae4c50440936 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] No waiting events found dispatching network-vif-unplugged-82dff43c-f553-4814-a2ec-d51fb34dd31e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.385 186792 DEBUG nova.compute.manager [req-edcd933e-361c-4eb6-959a-18cd56680ccf req-1f481700-477e-43a9-98fd-ae4c50440936 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-vif-unplugged-82dff43c-f553-4814-a2ec-d51fb34dd31e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.395 186792 DEBUG nova.virt.libvirt.vif [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1813873897',display_name='tempest-ImagesTestJSON-server-1813873897',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1813873897',id=68,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-jizt31sp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:40Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=dd6876d4-cab4-413b-9d67-10e2ba45a220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.396 186792 DEBUG nova.network.os_vif_util [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "address": "fa:16:3e:e2:0d:90", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82dff43c-f5", "ovs_interfaceid": "82dff43c-f553-4814-a2ec-d51fb34dd31e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.397 186792 DEBUG nova.network.os_vif_util [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.398 186792 DEBUG os_vif [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.400 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.400 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82dff43c-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.404 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.408 186792 INFO os_vif [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:0d:90,bridge_name='br-int',has_traffic_filtering=True,id=82dff43c-f553-4814-a2ec-d51fb34dd31e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82dff43c-f5')#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.409 186792 INFO nova.virt.libvirt.driver [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Deleting instance files /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220_del#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.409 186792 INFO nova.virt.libvirt.driver [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Deletion of /var/lib/nova/instances/dd6876d4-cab4-413b-9d67-10e2ba45a220_del complete#033[00m
Nov 22 02:56:02 np0005531888 podman[223559]: 2025-11-22 07:56:02.459000582 +0000 UTC m=+0.057819920 container remove 394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.468 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d86efef7-5b0d-47f9-8456-1b3b33ba7791]: (4, ('Sat Nov 22 07:56:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868)\n394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868\nSat Nov 22 07:56:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868)\n394a115057593637ae08952c1348def2de8a2183d29012399df4cbdc6d1dd868\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.470 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a840a484-2540-4068-9126-8c8a941b4091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.470 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.473 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 kernel: tapdc6b9ee8-e0: left promiscuous mode
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.487 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.490 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0fce3f84-5d60-4c07-ad9a-8713da02724c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.500 186792 INFO nova.compute.manager [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.501 186792 DEBUG oslo.service.loopingcall [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.502 186792 DEBUG nova.compute.manager [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.502 186792 DEBUG nova.network.neutron [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.507 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5a1829f5-ac31-42e7-ba84-6affd6f4521b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.509 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2bf855cb-dca0-46c7-b509-cf7f854553c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.527 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[85b808ea-e23e-41dc-afe6-12b3545e9fd8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487106, 'reachable_time': 16724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223574, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.531 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:56:02 np0005531888 systemd[1]: run-netns-ovnmeta\x2ddc6b9ee8\x2de824\x2d42ea\x2dbe5e\x2d5b3c4e48e46a.mount: Deactivated successfully.
Nov 22 02:56:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:02.531 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[5b676239-2c83-4246-8bc3-022d1db672fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:02 np0005531888 nova_compute[186788]: 2025-11-22 07:56:02.795 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.158 186792 DEBUG nova.network.neutron [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.175 186792 INFO nova.compute.manager [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Took 0.67 seconds to deallocate network for instance.#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.245 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.246 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.303 186792 DEBUG nova.compute.provider_tree [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.317 186792 DEBUG nova.scheduler.client.report [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.343 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.447 186792 INFO nova.scheduler.client.report [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Deleted allocations for instance dd6876d4-cab4-413b-9d67-10e2ba45a220#033[00m
Nov 22 02:56:03 np0005531888 nova_compute[186788]: 2025-11-22 07:56:03.558 186792 DEBUG oslo_concurrency.lockutils [None req-f993d8e6-c8fb-4fc2-8693-30bd10371f64 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.016 186792 DEBUG nova.compute.manager [req-d4cc8614-ec0c-422a-b855-adc08d5ec3ca req-725fd58f-b343-4cf9-ad6b-c76e8d7a135a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-vif-deleted-82dff43c-f553-4814-a2ec-d51fb34dd31e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.472 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.473 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.480 186792 DEBUG nova.compute.manager [req-4b442420-9ef9-42f1-9f97-254bc54c14a1 req-9fdce83e-3d29-47d7-a07b-b64850fccc71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.481 186792 DEBUG oslo_concurrency.lockutils [req-4b442420-9ef9-42f1-9f97-254bc54c14a1 req-9fdce83e-3d29-47d7-a07b-b64850fccc71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.481 186792 DEBUG oslo_concurrency.lockutils [req-4b442420-9ef9-42f1-9f97-254bc54c14a1 req-9fdce83e-3d29-47d7-a07b-b64850fccc71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.481 186792 DEBUG oslo_concurrency.lockutils [req-4b442420-9ef9-42f1-9f97-254bc54c14a1 req-9fdce83e-3d29-47d7-a07b-b64850fccc71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd6876d4-cab4-413b-9d67-10e2ba45a220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.482 186792 DEBUG nova.compute.manager [req-4b442420-9ef9-42f1-9f97-254bc54c14a1 req-9fdce83e-3d29-47d7-a07b-b64850fccc71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] No waiting events found dispatching network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.482 186792 WARNING nova.compute.manager [req-4b442420-9ef9-42f1-9f97-254bc54c14a1 req-9fdce83e-3d29-47d7-a07b-b64850fccc71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Received unexpected event network-vif-plugged-82dff43c-f553-4814-a2ec-d51fb34dd31e for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.491 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.561 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.562 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.569 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.570 186792 INFO nova.compute.claims [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.673 186792 DEBUG nova.compute.provider_tree [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.686 186792 DEBUG nova.scheduler.client.report [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.706 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.707 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.765 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.765 186792 DEBUG nova.network.neutron [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.786 186792 INFO nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.804 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.919 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.921 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.921 186792 INFO nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Creating image(s)#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.922 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "/var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.922 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.923 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.939 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:04 np0005531888 nova_compute[186788]: 2025-11-22 07:56:04.965 186792 DEBUG nova.policy [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.007 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.009 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.009 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.023 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.093 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.095 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.162 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk 1073741824" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.163 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.164 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.224 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.225 186792 DEBUG nova.virt.disk.api [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Checking if we can resize image /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.225 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.289 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.290 186792 DEBUG nova.virt.disk.api [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Cannot resize image /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.290 186792 DEBUG nova.objects.instance [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'migration_context' on Instance uuid a7aa669e-9a07-44ff-ad3d-919077eb0646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.306 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.306 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Ensure instance console log exists: /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.307 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.307 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.307 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:05 np0005531888 nova_compute[186788]: 2025-11-22 07:56:05.636 186792 DEBUG nova.network.neutron [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Successfully created port: 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:56:07 np0005531888 nova_compute[186788]: 2025-11-22 07:56:07.403 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:07 np0005531888 nova_compute[186788]: 2025-11-22 07:56:07.797 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:08 np0005531888 podman[223590]: 2025-11-22 07:56:08.724868238 +0000 UTC m=+0.092079181 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 02:56:08 np0005531888 podman[223591]: 2025-11-22 07:56:08.733218443 +0000 UTC m=+0.094766607 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.198 186792 DEBUG nova.network.neutron [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Successfully updated port: 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.219 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "refresh_cache-a7aa669e-9a07-44ff-ad3d-919077eb0646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.219 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquired lock "refresh_cache-a7aa669e-9a07-44ff-ad3d-919077eb0646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.220 186792 DEBUG nova.network.neutron [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.393 186792 DEBUG nova.compute.manager [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-changed-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.394 186792 DEBUG nova.compute.manager [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Refreshing instance network info cache due to event network-changed-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:56:09 np0005531888 nova_compute[186788]: 2025-11-22 07:56:09.394 186792 DEBUG oslo_concurrency.lockutils [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a7aa669e-9a07-44ff-ad3d-919077eb0646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:10 np0005531888 nova_compute[186788]: 2025-11-22 07:56:10.147 186792 DEBUG nova.network.neutron [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.885 186792 DEBUG nova.network.neutron [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Updating instance_info_cache with network_info: [{"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.911 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Releasing lock "refresh_cache-a7aa669e-9a07-44ff-ad3d-919077eb0646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.912 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Instance network_info: |[{"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.912 186792 DEBUG oslo_concurrency.lockutils [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a7aa669e-9a07-44ff-ad3d-919077eb0646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.913 186792 DEBUG nova.network.neutron [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Refreshing network info cache for port 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.917 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Start _get_guest_xml network_info=[{"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.925 186792 WARNING nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.933 186792 DEBUG nova.virt.libvirt.host [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.934 186792 DEBUG nova.virt.libvirt.host [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.938 186792 DEBUG nova.virt.libvirt.host [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.938 186792 DEBUG nova.virt.libvirt.host [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.939 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.939 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.940 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.940 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.940 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.941 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.941 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.941 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.941 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.941 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.942 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.942 186792 DEBUG nova.virt.hardware [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.945 186792 DEBUG nova.virt.libvirt.vif [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-606032013',display_name='tempest-ImagesTestJSON-server-606032013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-606032013',id=73,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-mz0p40lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:04Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=a7aa669e-9a07-44ff-ad3d-919077eb0646,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.946 186792 DEBUG nova.network.os_vif_util [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.946 186792 DEBUG nova.network.os_vif_util [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.947 186792 DEBUG nova.objects.instance [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid a7aa669e-9a07-44ff-ad3d-919077eb0646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.971 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <uuid>a7aa669e-9a07-44ff-ad3d-919077eb0646</uuid>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <name>instance-00000049</name>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:name>tempest-ImagesTestJSON-server-606032013</nova:name>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:56:11</nova:creationTime>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:user uuid="1ac2d2381d294c96aff369941185056a">tempest-ImagesTestJSON-117614339-project-member</nova:user>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:project uuid="7ec4007dc8214caab4e2eb40f11fb3cd">tempest-ImagesTestJSON-117614339</nova:project>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        <nova:port uuid="18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <entry name="serial">a7aa669e-9a07-44ff-ad3d-919077eb0646</entry>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <entry name="uuid">a7aa669e-9a07-44ff-ad3d-919077eb0646</entry>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.config"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:65:41:27"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <target dev="tap18dc8aa2-0b"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/console.log" append="off"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:56:11 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:56:11 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:56:11 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:56:11 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.972 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Preparing to wait for external event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.972 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.973 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.973 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.974 186792 DEBUG nova.virt.libvirt.vif [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-606032013',display_name='tempest-ImagesTestJSON-server-606032013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-606032013',id=73,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-mz0p40lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:04Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=a7aa669e-9a07-44ff-ad3d-919077eb0646,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.974 186792 DEBUG nova.network.os_vif_util [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.974 186792 DEBUG nova.network.os_vif_util [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.975 186792 DEBUG os_vif [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.976 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.976 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.980 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18dc8aa2-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.981 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18dc8aa2-0b, col_values=(('external_ids', {'iface-id': '18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:41:27', 'vm-uuid': 'a7aa669e-9a07-44ff-ad3d-919077eb0646'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:11 np0005531888 NetworkManager[55166]: <info>  [1763798171.9837] manager: (tap18dc8aa2-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.985 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.988 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:11 np0005531888 nova_compute[186788]: 2025-11-22 07:56:11.989 186792 INFO os_vif [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b')#033[00m
Nov 22 02:56:12 np0005531888 nova_compute[186788]: 2025-11-22 07:56:12.069 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:12 np0005531888 nova_compute[186788]: 2025-11-22 07:56:12.070 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:12 np0005531888 nova_compute[186788]: 2025-11-22 07:56:12.071 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No VIF found with MAC fa:16:3e:65:41:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:56:12 np0005531888 nova_compute[186788]: 2025-11-22 07:56:12.071 186792 INFO nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Using config drive#033[00m
Nov 22 02:56:12 np0005531888 nova_compute[186788]: 2025-11-22 07:56:12.799 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.122 186792 INFO nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Creating config drive at /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.config#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.127 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr579vfa3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.255 186792 DEBUG oslo_concurrency.processutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr579vfa3" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:13 np0005531888 kernel: tap18dc8aa2-0b: entered promiscuous mode
Nov 22 02:56:13 np0005531888 NetworkManager[55166]: <info>  [1763798173.3388] manager: (tap18dc8aa2-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Nov 22 02:56:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:13Z|00177|binding|INFO|Claiming lport 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 for this chassis.
Nov 22 02:56:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:13Z|00178|binding|INFO|18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8: Claiming fa:16:3e:65:41:27 10.100.0.11
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:13Z|00179|binding|INFO|Setting lport 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 ovn-installed in OVS
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.352 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:13Z|00180|binding|INFO|Setting lport 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 up in Southbound
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.353 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.354 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:41:27 10.100.0.11'], port_security=['fa:16:3e:65:41:27 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a7aa669e-9a07-44ff-ad3d-919077eb0646', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.356 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a bound to our chassis#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.359 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.373 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[04329b89-91a1-4b7f-a463-b957760b6d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.374 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc6b9ee8-e1 in ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:56:13 np0005531888 systemd-udevd[223656]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.378 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc6b9ee8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.378 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[05d7682b-afed-415c-b127-f725a266edff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.379 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e15e16e6-f115-4fc5-9f08-a492085311f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 systemd-machined[153106]: New machine qemu-33-instance-00000049.
Nov 22 02:56:13 np0005531888 NetworkManager[55166]: <info>  [1763798173.3921] device (tap18dc8aa2-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:56:13 np0005531888 NetworkManager[55166]: <info>  [1763798173.3930] device (tap18dc8aa2-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.393 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[af943c5d-2d38-4382-98dc-efde6791305c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 systemd[1]: Started Virtual Machine qemu-33-instance-00000049.
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.419 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be191441-4fa3-4d16-a81e-15049991d272]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.448 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[da57d062-0dea-42dd-b35a-d8ac584d2924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 NetworkManager[55166]: <info>  [1763798173.4547] manager: (tapdc6b9ee8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/97)
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.455 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0703a16a-4be5-4793-92e5-6ca488c06370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.485 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d3711538-8536-41cd-805f-ef464ad06517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.489 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff08010-b4e3-46c2-8865-d1c29d8cfdc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 NetworkManager[55166]: <info>  [1763798173.5124] device (tapdc6b9ee8-e0): carrier: link connected
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.519 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f644e9-d7f6-4bb9-8fc5-261d5d124835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.539 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2297d726-914e-4d08-8a9d-043562d1b573]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491415, 'reachable_time': 40736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223689, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.557 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f12e731a-623d-48ab-b707-9dcbb4e1184c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d89c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 491415, 'tstamp': 491415}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223690, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.574 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8b73e0ca-ec1b-4952-847d-6b97607e4385]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491415, 'reachable_time': 40736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223691, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.609 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[882597c2-555d-49e3-ab5c-1a2edeecbe72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.674 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a16d92d8-02e3-406c-9d00-932bc999d9a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.676 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.677 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.677 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc6b9ee8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.679 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 kernel: tapdc6b9ee8-e0: entered promiscuous mode
Nov 22 02:56:13 np0005531888 NetworkManager[55166]: <info>  [1763798173.6816] manager: (tapdc6b9ee8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.684 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc6b9ee8-e0, col_values=(('external_ids', {'iface-id': '99cae854-daa9-4d08-8152-257a15e21bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:13 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:13Z|00181|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.686 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.687 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.688 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4051ed13-db22-4750-9506-a6f9e612c4ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.689 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.689 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798173.6891959, a7aa669e-9a07-44ff-ad3d-919077eb0646 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:13.690 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'env', 'PROCESS_TAG=haproxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.690 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] VM Started (Lifecycle Event)#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.698 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.708 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.713 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798173.6902058, a7aa669e-9a07-44ff-ad3d-919077eb0646 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.714 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.731 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.735 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:13 np0005531888 nova_compute[186788]: 2025-11-22 07:56:13.757 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:14 np0005531888 podman[223730]: 2025-11-22 07:56:14.114442121 +0000 UTC m=+0.109131238 container create 18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:56:14 np0005531888 podman[223730]: 2025-11-22 07:56:14.028479382 +0000 UTC m=+0.023168529 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:56:14 np0005531888 systemd[1]: Started libpod-conmon-18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8.scope.
Nov 22 02:56:14 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:56:14 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c35bbd6bff369c566d46db2daf5f32751aaf0b69f799cffdb8edaeb02049246/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:56:14 np0005531888 podman[223730]: 2025-11-22 07:56:14.208147212 +0000 UTC m=+0.202836349 container init 18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 02:56:14 np0005531888 podman[223730]: 2025-11-22 07:56:14.21417174 +0000 UTC m=+0.208860857 container start 18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:14 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [NOTICE]   (223749) : New worker (223751) forked
Nov 22 02:56:14 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [NOTICE]   (223749) : Loading success.
Nov 22 02:56:14 np0005531888 nova_compute[186788]: 2025-11-22 07:56:14.578 186792 DEBUG nova.network.neutron [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Updated VIF entry in instance network info cache for port 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:56:14 np0005531888 nova_compute[186788]: 2025-11-22 07:56:14.579 186792 DEBUG nova.network.neutron [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Updating instance_info_cache with network_info: [{"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:14 np0005531888 nova_compute[186788]: 2025-11-22 07:56:14.599 186792 DEBUG oslo_concurrency.lockutils [req-7aa56ba3-9549-4ddf-a084-b4c88d493f08 req-c8901565-8a2f-4cb0-b86c-24fe32526c99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a7aa669e-9a07-44ff-ad3d-919077eb0646" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:16 np0005531888 nova_compute[186788]: 2025-11-22 07:56:16.195 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:16.196 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:16.197 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:56:16 np0005531888 podman[223760]: 2025-11-22 07:56:16.691897124 +0000 UTC m=+0.063991652 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:56:16 np0005531888 nova_compute[186788]: 2025-11-22 07:56:16.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:17 np0005531888 nova_compute[186788]: 2025-11-22 07:56:17.366 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798162.3649244, dd6876d4-cab4-413b-9d67-10e2ba45a220 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:17 np0005531888 nova_compute[186788]: 2025-11-22 07:56:17.367 186792 INFO nova.compute.manager [-] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:56:17 np0005531888 nova_compute[186788]: 2025-11-22 07:56:17.387 186792 DEBUG nova.compute.manager [None req-f487e333-fca1-487f-a227-e908c9347eca - - - - - -] [instance: dd6876d4-cab4-413b-9d67-10e2ba45a220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:17 np0005531888 nova_compute[186788]: 2025-11-22 07:56:17.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.069 186792 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.070 186792 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.070 186792 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.071 186792 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.071 186792 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Processing event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.071 186792 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.071 186792 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.072 186792 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.072 186792 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.072 186792 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] No waiting events found dispatching network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.072 186792 WARNING nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received unexpected event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.073 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.077 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798178.0776563, a7aa669e-9a07-44ff-ad3d-919077eb0646 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.078 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.080 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.083 186792 INFO nova.virt.libvirt.driver [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Instance spawned successfully.#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.084 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.109 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.110 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.110 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.111 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.111 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.112 186792 DEBUG nova.virt.libvirt.driver [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.118 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.121 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.159 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.255 186792 INFO nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Took 13.33 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.255 186792 DEBUG nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.366 186792 INFO nova.compute.manager [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Took 13.83 seconds to build instance.#033[00m
Nov 22 02:56:18 np0005531888 nova_compute[186788]: 2025-11-22 07:56:18.385 186792 DEBUG oslo_concurrency.lockutils [None req-3286c37f-67e9-492b-8478-9a0755bcf3df 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:18 np0005531888 podman[223784]: 2025-11-22 07:56:18.677724093 +0000 UTC m=+0.051205559 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:20.200 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:20 np0005531888 nova_compute[186788]: 2025-11-22 07:56:20.251 186792 DEBUG nova.compute.manager [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:20 np0005531888 nova_compute[186788]: 2025-11-22 07:56:20.321 186792 INFO nova.compute.manager [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] instance snapshotting#033[00m
Nov 22 02:56:20 np0005531888 nova_compute[186788]: 2025-11-22 07:56:20.701 186792 INFO nova.virt.libvirt.driver [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Beginning live snapshot process#033[00m
Nov 22 02:56:21 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.557 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.617 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk --force-share --output=json -f qcow2" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.618 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.684 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.714 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.780 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.782 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpba6yd2ap/79f98d07ec4f48a486f00e3f6b1a1ebb.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.830 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpba6yd2ap/79f98d07ec4f48a486f00e3f6b1a1ebb.delta 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.832 186792 INFO nova.virt.libvirt.driver [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.900 186792 DEBUG nova.virt.libvirt.guest [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.903 186792 INFO nova.virt.libvirt.driver [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.948 186792 DEBUG nova.privsep.utils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.949 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpba6yd2ap/79f98d07ec4f48a486f00e3f6b1a1ebb.delta /var/lib/nova/instances/snapshots/tmpba6yd2ap/79f98d07ec4f48a486f00e3f6b1a1ebb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:21 np0005531888 nova_compute[186788]: 2025-11-22 07:56:21.984 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:22 np0005531888 nova_compute[186788]: 2025-11-22 07:56:22.161 186792 DEBUG oslo_concurrency.processutils [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpba6yd2ap/79f98d07ec4f48a486f00e3f6b1a1ebb.delta /var/lib/nova/instances/snapshots/tmpba6yd2ap/79f98d07ec4f48a486f00e3f6b1a1ebb" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:22 np0005531888 nova_compute[186788]: 2025-11-22 07:56:22.162 186792 INFO nova.virt.libvirt.driver [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:56:22 np0005531888 nova_compute[186788]: 2025-11-22 07:56:22.648 186792 WARNING nova.compute.manager [None req-b0ce29dd-72ea-4395-bbf4-4e5f15d770e2 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Image not found during snapshot: nova.exception.ImageNotFound: Image 81bdf64d-c1fd-419c-bc57-d437ebd5b66a could not be found.#033[00m
Nov 22 02:56:22 np0005531888 nova_compute[186788]: 2025-11-22 07:56:22.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.867 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.868 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.868 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.868 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.869 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.877 186792 INFO nova.compute.manager [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Terminating instance#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.882 186792 DEBUG nova.compute.manager [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:56:23 np0005531888 kernel: tap18dc8aa2-0b (unregistering): left promiscuous mode
Nov 22 02:56:23 np0005531888 NetworkManager[55166]: <info>  [1763798183.9077] device (tap18dc8aa2-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:56:23 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:23Z|00182|binding|INFO|Releasing lport 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 from this chassis (sb_readonly=0)
Nov 22 02:56:23 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:23Z|00183|binding|INFO|Setting lport 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 down in Southbound
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.918 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:23 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:23Z|00184|binding|INFO|Removing iface tap18dc8aa2-0b ovn-installed in OVS
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.921 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:23.929 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:41:27 10.100.0.11'], port_security=['fa:16:3e:65:41:27 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a7aa669e-9a07-44ff-ad3d-919077eb0646', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:23.930 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a unbound from our chassis#033[00m
Nov 22 02:56:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:23.932 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:56:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:23.933 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2f968c17-2afe-4885-83fb-448f7b92efd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:23.933 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace which is not needed anymore#033[00m
Nov 22 02:56:23 np0005531888 nova_compute[186788]: 2025-11-22 07:56:23.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:23 np0005531888 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000049.scope: Deactivated successfully.
Nov 22 02:56:23 np0005531888 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000049.scope: Consumed 6.197s CPU time.
Nov 22 02:56:23 np0005531888 systemd-machined[153106]: Machine qemu-33-instance-00000049 terminated.
Nov 22 02:56:24 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [NOTICE]   (223749) : haproxy version is 2.8.14-c23fe91
Nov 22 02:56:24 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [NOTICE]   (223749) : path to executable is /usr/sbin/haproxy
Nov 22 02:56:24 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [WARNING]  (223749) : Exiting Master process...
Nov 22 02:56:24 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [ALERT]    (223749) : Current worker (223751) exited with code 143 (Terminated)
Nov 22 02:56:24 np0005531888 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[223745]: [WARNING]  (223749) : All workers exited. Exiting... (0)
Nov 22 02:56:24 np0005531888 systemd[1]: libpod-18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8.scope: Deactivated successfully.
Nov 22 02:56:24 np0005531888 podman[223853]: 2025-11-22 07:56:24.122955459 +0000 UTC m=+0.096903454 container died 18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.153 186792 INFO nova.virt.libvirt.driver [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Instance destroyed successfully.#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.154 186792 DEBUG nova.objects.instance [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'resources' on Instance uuid a7aa669e-9a07-44ff-ad3d-919077eb0646 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.166 186792 DEBUG nova.virt.libvirt.vif [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-606032013',display_name='tempest-ImagesTestJSON-server-606032013',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-606032013',id=73,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-mz0p40lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:56:22Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=a7aa669e-9a07-44ff-ad3d-919077eb0646,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.166 186792 DEBUG nova.network.os_vif_util [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "address": "fa:16:3e:65:41:27", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18dc8aa2-0b", "ovs_interfaceid": "18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.167 186792 DEBUG nova.network.os_vif_util [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.167 186792 DEBUG os_vif [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.169 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.169 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18dc8aa2-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.171 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.173 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.175 186792 INFO os_vif [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:41:27,bridge_name='br-int',has_traffic_filtering=True,id=18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18dc8aa2-0b')#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.176 186792 INFO nova.virt.libvirt.driver [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Deleting instance files /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646_del#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.176 186792 INFO nova.virt.libvirt.driver [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Deletion of /var/lib/nova/instances/a7aa669e-9a07-44ff-ad3d-919077eb0646_del complete#033[00m
Nov 22 02:56:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8-userdata-shm.mount: Deactivated successfully.
Nov 22 02:56:24 np0005531888 systemd[1]: var-lib-containers-storage-overlay-9c35bbd6bff369c566d46db2daf5f32751aaf0b69f799cffdb8edaeb02049246-merged.mount: Deactivated successfully.
Nov 22 02:56:24 np0005531888 podman[223853]: 2025-11-22 07:56:24.256529675 +0000 UTC m=+0.230477650 container cleanup 18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.272 186792 INFO nova.compute.manager [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.273 186792 DEBUG oslo.service.loopingcall [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.273 186792 DEBUG nova.compute.manager [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.274 186792 DEBUG nova.network.neutron [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:56:24 np0005531888 systemd[1]: libpod-conmon-18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8.scope: Deactivated successfully.
Nov 22 02:56:24 np0005531888 podman[223897]: 2025-11-22 07:56:24.46683396 +0000 UTC m=+0.188145521 container remove 18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.472 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[506d3552-5b27-45ed-9cb2-cc0f5f7ca9f0]: (4, ('Sat Nov 22 07:56:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8)\n18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8\nSat Nov 22 07:56:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8)\n18bf02b1c221a8037706f60feffc372c946b79b097ce7c6e645841aba940bfb8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.474 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[71673a83-c0f0-402f-ba2a-244fad7cb0b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.475 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.477 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:24 np0005531888 kernel: tapdc6b9ee8-e0: left promiscuous mode
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.492 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1355d05-967e-4fa5-b0c1-53ed5ad98862]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.511 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[992d6d34-4ed0-4c9e-a6d0-488eade179a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.512 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f3db2535-0ddf-498b-8776-1e8bdb4194d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.527 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e6190871-41d4-410b-8147-17ad14c994e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491408, 'reachable_time': 15331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223914, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.529 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:56:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:24.530 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9ef1e0-76ff-4d5b-b6fd-926891ad7f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:24 np0005531888 systemd[1]: run-netns-ovnmeta\x2ddc6b9ee8\x2de824\x2d42ea\x2dbe5e\x2d5b3c4e48e46a.mount: Deactivated successfully.
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.644 186792 DEBUG nova.compute.manager [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-vif-unplugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.644 186792 DEBUG oslo_concurrency.lockutils [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.645 186792 DEBUG oslo_concurrency.lockutils [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.645 186792 DEBUG oslo_concurrency.lockutils [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.645 186792 DEBUG nova.compute.manager [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] No waiting events found dispatching network-vif-unplugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.645 186792 DEBUG nova.compute.manager [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-vif-unplugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.646 186792 DEBUG nova.compute.manager [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.646 186792 DEBUG oslo_concurrency.lockutils [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.646 186792 DEBUG oslo_concurrency.lockutils [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.646 186792 DEBUG oslo_concurrency.lockutils [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.647 186792 DEBUG nova.compute.manager [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] No waiting events found dispatching network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:24 np0005531888 nova_compute[186788]: 2025-11-22 07:56:24.647 186792 WARNING nova.compute.manager [req-e69ee2f7-324a-4a5c-ba43-40fb1b6763d0 req-8e2532f0-5631-467e-b565-66aa5956f843 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received unexpected event network-vif-plugged-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.336 186792 DEBUG nova.network.neutron [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.368 186792 INFO nova.compute.manager [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Took 1.09 seconds to deallocate network for instance.#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.444 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.445 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.539 186792 DEBUG nova.compute.provider_tree [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.560 186792 DEBUG nova.scheduler.client.report [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.580 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.605 186792 INFO nova.scheduler.client.report [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Deleted allocations for instance a7aa669e-9a07-44ff-ad3d-919077eb0646#033[00m
Nov 22 02:56:25 np0005531888 nova_compute[186788]: 2025-11-22 07:56:25.672 186792 DEBUG oslo_concurrency.lockutils [None req-c89f9802-5f21-4788-a91a-7630fd81af1d 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a7aa669e-9a07-44ff-ad3d-919077eb0646" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:25 np0005531888 podman[223915]: 2025-11-22 07:56:25.684899005 +0000 UTC m=+0.057627529 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 02:56:26 np0005531888 nova_compute[186788]: 2025-11-22 07:56:26.735 186792 DEBUG nova.compute.manager [req-6601e067-b9d8-4a20-ac51-85e40c791e7f req-cbe83198-dbd6-4426-aee0-4e82c2b62623 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Received event network-vif-deleted-18dc8aa2-0bbc-46c3-b5ff-4d4f51a8ebf8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:27 np0005531888 nova_compute[186788]: 2025-11-22 07:56:27.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:28 np0005531888 podman[223932]: 2025-11-22 07:56:28.679854475 +0000 UTC m=+0.051988691 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:56:29 np0005531888 nova_compute[186788]: 2025-11-22 07:56:29.173 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531888 podman[223957]: 2025-11-22 07:56:32.678751583 +0000 UTC m=+0.051407566 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:56:32 np0005531888 nova_compute[186788]: 2025-11-22 07:56:32.807 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:34 np0005531888 nova_compute[186788]: 2025-11-22 07:56:34.176 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.339 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.712 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.713 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.734 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.831 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.832 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.841 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:56:35 np0005531888 nova_compute[186788]: 2025-11-22 07:56:35.841 186792 INFO nova.compute.claims [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.037 186792 DEBUG nova.compute.provider_tree [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.052 186792 DEBUG nova.scheduler.client.report [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.073 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.073 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.161 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.162 186792 DEBUG nova.network.neutron [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.214 186792 INFO nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.232 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.381 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.382 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.383 186792 INFO nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Creating image(s)#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.383 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.384 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.384 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.401 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.473 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.474 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.475 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.487 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.545 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.546 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.585 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.586 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.587 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.613 186792 DEBUG nova.policy [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.651 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.652 186792 DEBUG nova.virt.disk.api [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Checking if we can resize image /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.652 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.727 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.728 186792 DEBUG nova.virt.disk.api [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Cannot resize image /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.729 186792 DEBUG nova.objects.instance [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'migration_context' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.750 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.751 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Ensure instance console log exists: /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.751 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.752 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:36 np0005531888 nova_compute[186788]: 2025-11-22 07:56:36.752 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:36.809 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:36.810 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:36.810 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.840 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.841 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.842 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:56:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:37 np0005531888 nova_compute[186788]: 2025-11-22 07:56:37.808 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:37 np0005531888 nova_compute[186788]: 2025-11-22 07:56:37.817 186792 DEBUG nova.network.neutron [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Successfully created port: 880dadfb-6870-48ad-9c4e-f8cb0370d421 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.152 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798184.1508436, a7aa669e-9a07-44ff-ad3d-919077eb0646 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.153 186792 INFO nova.compute.manager [-] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.180 186792 DEBUG nova.compute.manager [None req-ee99b072-e975-4d25-ad4f-9980594887d7 - - - - - -] [instance: a7aa669e-9a07-44ff-ad3d-919077eb0646] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.181 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:39 np0005531888 podman[223993]: 2025-11-22 07:56:39.706437883 +0000 UTC m=+0.075908098 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:56:39 np0005531888 podman[223994]: 2025-11-22 07:56:39.73153266 +0000 UTC m=+0.096650798 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.785 186792 DEBUG nova.network.neutron [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Successfully updated port: 880dadfb-6870-48ad-9c4e-f8cb0370d421 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.836 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.836 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquired lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:39 np0005531888 nova_compute[186788]: 2025-11-22 07:56:39.836 186792 DEBUG nova.network.neutron [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:56:40 np0005531888 nova_compute[186788]: 2025-11-22 07:56:40.051 186792 DEBUG nova.compute.manager [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-changed-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:40 np0005531888 nova_compute[186788]: 2025-11-22 07:56:40.052 186792 DEBUG nova.compute.manager [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Refreshing instance network info cache due to event network-changed-880dadfb-6870-48ad-9c4e-f8cb0370d421. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:56:40 np0005531888 nova_compute[186788]: 2025-11-22 07:56:40.052 186792 DEBUG oslo_concurrency.lockutils [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:40 np0005531888 nova_compute[186788]: 2025-11-22 07:56:40.184 186792 DEBUG nova.network.neutron [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.313 186792 DEBUG nova.network.neutron [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updating instance_info_cache with network_info: [{"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.332 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Releasing lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.332 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance network_info: |[{"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.333 186792 DEBUG oslo_concurrency.lockutils [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.333 186792 DEBUG nova.network.neutron [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Refreshing network info cache for port 880dadfb-6870-48ad-9c4e-f8cb0370d421 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.335 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Start _get_guest_xml network_info=[{"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.340 186792 WARNING nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.348 186792 DEBUG nova.virt.libvirt.host [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.348 186792 DEBUG nova.virt.libvirt.host [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.356 186792 DEBUG nova.virt.libvirt.host [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.357 186792 DEBUG nova.virt.libvirt.host [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.358 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.359 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.359 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.359 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.360 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.360 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.360 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.360 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.361 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.361 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.361 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.361 186792 DEBUG nova.virt.hardware [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.365 186792 DEBUG nova.virt.libvirt.vif [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-900662590',display_name='tempest-tempest.common.compute-instance-900662590',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-900662590',id=75,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-guig1ujw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActio
nsTestOtherA-1599563713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:36Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=c2fbcd94-57ca-4226-ad75-cdf60b578fd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.365 186792 DEBUG nova.network.os_vif_util [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.366 186792 DEBUG nova.network.os_vif_util [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.367 186792 DEBUG nova.objects.instance [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.399 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <uuid>c2fbcd94-57ca-4226-ad75-cdf60b578fd5</uuid>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <name>instance-0000004b</name>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:name>tempest-tempest.common.compute-instance-900662590</nova:name>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:56:41</nova:creationTime>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:user uuid="5a5a623606e647c183360572aab20b70">tempest-ServerActionsTestOtherA-1599563713-project-member</nova:user>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:project uuid="af3a536766704caaad94e5da2e3b88e2">tempest-ServerActionsTestOtherA-1599563713</nova:project>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        <nova:port uuid="880dadfb-6870-48ad-9c4e-f8cb0370d421">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <entry name="serial">c2fbcd94-57ca-4226-ad75-cdf60b578fd5</entry>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <entry name="uuid">c2fbcd94-57ca-4226-ad75-cdf60b578fd5</entry>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:b0:c5:66"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <target dev="tap880dadfb-68"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/console.log" append="off"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:56:41 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:56:41 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:56:41 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:56:41 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.400 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Preparing to wait for external event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.401 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.401 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.401 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.402 186792 DEBUG nova.virt.libvirt.vif [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-900662590',display_name='tempest-tempest.common.compute-instance-900662590',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-900662590',id=75,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-guig1ujw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-S
erverActionsTestOtherA-1599563713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:36Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=c2fbcd94-57ca-4226-ad75-cdf60b578fd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.403 186792 DEBUG nova.network.os_vif_util [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.403 186792 DEBUG nova.network.os_vif_util [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.404 186792 DEBUG os_vif [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.405 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.406 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.409 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap880dadfb-68, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.409 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap880dadfb-68, col_values=(('external_ids', {'iface-id': '880dadfb-6870-48ad-9c4e-f8cb0370d421', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:c5:66', 'vm-uuid': 'c2fbcd94-57ca-4226-ad75-cdf60b578fd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.411 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:41 np0005531888 NetworkManager[55166]: <info>  [1763798201.4119] manager: (tap880dadfb-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.417 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.418 186792 INFO os_vif [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68')#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.487 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.488 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.488 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No VIF found with MAC fa:16:3e:b0:c5:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:56:41 np0005531888 nova_compute[186788]: 2025-11-22 07:56:41.489 186792 INFO nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Using config drive#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.238 186792 INFO nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Creating config drive at /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.244 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3v1k8knw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.375 186792 DEBUG oslo_concurrency.processutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3v1k8knw" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:42 np0005531888 kernel: tap880dadfb-68: entered promiscuous mode
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.451 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:42Z|00185|binding|INFO|Claiming lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 for this chassis.
Nov 22 02:56:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:42Z|00186|binding|INFO|880dadfb-6870-48ad-9c4e-f8cb0370d421: Claiming fa:16:3e:b0:c5:66 10.100.0.12
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.4530] manager: (tap880dadfb-68): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.461 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.466 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.4700] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.469 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.4708] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.476 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c5:66 10.100.0.12'], port_security=['fa:16:3e:b0:c5:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c2fbcd94-57ca-4226-ad75-cdf60b578fd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80d022ea-fcc6-47bf-8d54-551da59f082d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=880dadfb-6870-48ad-9c4e-f8cb0370d421) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.477 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 880dadfb-6870-48ad-9c4e-f8cb0370d421 in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e bound to our chassis#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.479 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2b438ab-8fa8-4627-8c04-99bed701c19e#033[00m
Nov 22 02:56:42 np0005531888 systemd-udevd[224057]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.491 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe42889-553f-4035-b9ec-3111b554f73d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.493 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2b438ab-81 in ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.494 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2b438ab-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7d48c047-a54b-4c6e-b316-ec0185406ee3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.4962] device (tap880dadfb-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.4969] device (tap880dadfb-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0648db-6e36-4b80-a584-99decade7de1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 systemd-machined[153106]: New machine qemu-34-instance-0000004b.
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.509 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1c45f7-543f-4414-a4fb-6a6f4fb82194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.538 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4eee6599-6eef-4704-a0f5-74ffa7decbd5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.566 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b7055e4d-669d-4d98-a4a7-02a0b92369ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 systemd[1]: Started Virtual Machine qemu-34-instance-0000004b.
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.5741] manager: (tapa2b438ab-80): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.573 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8b13e1-e812-429e-b1e0-1abc5f22ac29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.575 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.588 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:42Z|00187|binding|INFO|Setting lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 ovn-installed in OVS
Nov 22 02:56:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:42Z|00188|binding|INFO|Setting lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 up in Southbound
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.598 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.612 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e913a7a4-3241-4e0b-ae35-247c995670ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.615 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dd84c65a-7d0a-459c-ae35-bc537bacdb56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.6415] device (tapa2b438ab-80): carrier: link connected
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.646 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ae38c74f-4c66-4a77-93a7-14f01a924ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.663 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d18f8817-3409-464c-8113-2fdeeb7cfb29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494328, 'reachable_time': 16757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224092, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.680 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[914df507-faf9-4384-bb2f-27da2e1560c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:f49d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494328, 'tstamp': 494328}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224094, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.700 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1b876175-d3f8-46b6-9d30-96a19ab14c61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494328, 'reachable_time': 16757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224095, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.733 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0d29e8c7-3dfe-4924-9ed6-88e34d57459a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.793 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed5d642-599f-467f-8093-b22ae0fd8132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.794 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.794 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.795 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b438ab-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.797 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 NetworkManager[55166]: <info>  [1763798202.7979] manager: (tapa2b438ab-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Nov 22 02:56:42 np0005531888 kernel: tapa2b438ab-80: entered promiscuous mode
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.802 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2b438ab-80, col_values=(('external_ids', {'iface-id': '1f7bc015-fb2f-41a5-82bb-16526b7a95f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:42Z|00189|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.806 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.807 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fa8b4591-8d48-4930-878a-cc6efe5df6b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.808 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:56:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:42.809 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'env', 'PROCESS_TAG=haproxy-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2b438ab-8fa8-4627-8c04-99bed701c19e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:56:42 np0005531888 nova_compute[186788]: 2025-11-22 07:56:42.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:43 np0005531888 podman[224130]: 2025-11-22 07:56:43.222677417 +0000 UTC m=+0.058194762 container create 53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.235 186792 DEBUG nova.network.neutron [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updated VIF entry in instance network info cache for port 880dadfb-6870-48ad-9c4e-f8cb0370d421. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.236 186792 DEBUG nova.network.neutron [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updating instance_info_cache with network_info: [{"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.242 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798203.2421916, c2fbcd94-57ca-4226-ad75-cdf60b578fd5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.243 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] VM Started (Lifecycle Event)#033[00m
Nov 22 02:56:43 np0005531888 systemd[1]: Started libpod-conmon-53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196.scope.
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.271 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.275 186792 DEBUG oslo_concurrency.lockutils [req-78b419e6-568d-4d16-83f1-12e4bc631671 req-00a84aad-2b58-49d1-87b7-fea13d0a5316 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.278 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798203.242296, c2fbcd94-57ca-4226-ad75-cdf60b578fd5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.278 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:56:43 np0005531888 podman[224130]: 2025-11-22 07:56:43.193954801 +0000 UTC m=+0.029472166 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:56:43 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.298 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:43 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a88bcd75ebb2857853161b229d9b35192deeb968941b5647631a53a417f9e44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.307 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:43 np0005531888 podman[224130]: 2025-11-22 07:56:43.312896307 +0000 UTC m=+0.148413662 container init 53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 02:56:43 np0005531888 podman[224130]: 2025-11-22 07:56:43.318864723 +0000 UTC m=+0.154382068 container start 53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.328 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:43 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [NOTICE]   (224153) : New worker (224155) forked
Nov 22 02:56:43 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [NOTICE]   (224153) : Loading success.
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.465 186792 DEBUG nova.compute.manager [req-cd49f40e-9986-4908-a657-96574430a386 req-b80147c6-959f-4f52-b396-45f8f6d45d4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.466 186792 DEBUG oslo_concurrency.lockutils [req-cd49f40e-9986-4908-a657-96574430a386 req-b80147c6-959f-4f52-b396-45f8f6d45d4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.466 186792 DEBUG oslo_concurrency.lockutils [req-cd49f40e-9986-4908-a657-96574430a386 req-b80147c6-959f-4f52-b396-45f8f6d45d4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.467 186792 DEBUG oslo_concurrency.lockutils [req-cd49f40e-9986-4908-a657-96574430a386 req-b80147c6-959f-4f52-b396-45f8f6d45d4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.467 186792 DEBUG nova.compute.manager [req-cd49f40e-9986-4908-a657-96574430a386 req-b80147c6-959f-4f52-b396-45f8f6d45d4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Processing event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.468 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.471 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798203.4713895, c2fbcd94-57ca-4226-ad75-cdf60b578fd5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.471 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.474 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.478 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance spawned successfully.#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.478 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.496 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.503 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.507 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.508 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.509 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.509 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.510 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.510 186792 DEBUG nova.virt.libvirt.driver [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.575 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.686 186792 INFO nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Took 7.30 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.686 186792 DEBUG nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.769 186792 INFO nova.compute.manager [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Took 7.97 seconds to build instance.#033[00m
Nov 22 02:56:43 np0005531888 nova_compute[186788]: 2025-11-22 07:56:43.794 186792 DEBUG oslo_concurrency.lockutils [None req-94a302c5-b2a1-429e-845f-1cb4b09c4da1 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.655 186792 DEBUG nova.compute.manager [req-bd99f64f-7c9a-45fe-94eb-047e62e88765 req-90e9ed00-a226-43aa-be89-52a720d2f094 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.655 186792 DEBUG oslo_concurrency.lockutils [req-bd99f64f-7c9a-45fe-94eb-047e62e88765 req-90e9ed00-a226-43aa-be89-52a720d2f094 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.656 186792 DEBUG oslo_concurrency.lockutils [req-bd99f64f-7c9a-45fe-94eb-047e62e88765 req-90e9ed00-a226-43aa-be89-52a720d2f094 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.656 186792 DEBUG oslo_concurrency.lockutils [req-bd99f64f-7c9a-45fe-94eb-047e62e88765 req-90e9ed00-a226-43aa-be89-52a720d2f094 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.656 186792 DEBUG nova.compute.manager [req-bd99f64f-7c9a-45fe-94eb-047e62e88765 req-90e9ed00-a226-43aa-be89-52a720d2f094 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] No waiting events found dispatching network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.656 186792 WARNING nova.compute.manager [req-bd99f64f-7c9a-45fe-94eb-047e62e88765 req-90e9ed00-a226-43aa-be89-52a720d2f094 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received unexpected event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.813 186792 DEBUG oslo_concurrency.lockutils [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.814 186792 DEBUG oslo_concurrency.lockutils [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.814 186792 DEBUG nova.compute.manager [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.818 186792 DEBUG nova.compute.manager [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.819 186792 DEBUG nova.objects.instance [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'flavor' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.842 186792 DEBUG nova.objects.instance [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'info_cache' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.863 186792 DEBUG nova.virt.libvirt.driver [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.978 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:56:45 np0005531888 nova_compute[186788]: 2025-11-22 07:56:45.978 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:46 np0005531888 nova_compute[186788]: 2025-11-22 07:56:46.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:47 np0005531888 podman[224164]: 2025-11-22 07:56:47.69244999 +0000 UTC m=+0.054308567 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:56:47 np0005531888 nova_compute[186788]: 2025-11-22 07:56:47.703 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updating instance_info_cache with network_info: [{"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:47 np0005531888 nova_compute[186788]: 2025-11-22 07:56:47.723 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-c2fbcd94-57ca-4226-ad75-cdf60b578fd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:47 np0005531888 nova_compute[186788]: 2025-11-22 07:56:47.724 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:56:47 np0005531888 nova_compute[186788]: 2025-11-22 07:56:47.724 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:47 np0005531888 nova_compute[186788]: 2025-11-22 07:56:47.724 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:47 np0005531888 nova_compute[186788]: 2025-11-22 07:56:47.825 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:48 np0005531888 nova_compute[186788]: 2025-11-22 07:56:48.500 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:48 np0005531888 nova_compute[186788]: 2025-11-22 07:56:48.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:49 np0005531888 podman[224189]: 2025-11-22 07:56:49.680116409 +0000 UTC m=+0.049437727 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:49 np0005531888 nova_compute[186788]: 2025-11-22 07:56:49.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:51 np0005531888 nova_compute[186788]: 2025-11-22 07:56:51.419 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:51 np0005531888 nova_compute[186788]: 2025-11-22 07:56:51.626 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:52 np0005531888 nova_compute[186788]: 2025-11-22 07:56:52.829 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:55 np0005531888 nova_compute[186788]: 2025-11-22 07:56:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:55 np0005531888 nova_compute[186788]: 2025-11-22 07:56:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:55 np0005531888 nova_compute[186788]: 2025-11-22 07:56:55.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.107 186792 DEBUG nova.virt.libvirt.driver [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.428 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:56 np0005531888 podman[224227]: 2025-11-22 07:56:56.696390838 +0000 UTC m=+0.061384970 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.993 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:56 np0005531888 nova_compute[186788]: 2025-11-22 07:56:56.993 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.079 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:57Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:c5:66 10.100.0.12
Nov 22 02:56:57 np0005531888 ovn_controller[95067]: 2025-11-22T07:56:57Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:c5:66 10.100.0.12
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.152 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.153 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.222 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.401 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.402 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5539MB free_disk=73.32177352905273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.402 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.403 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.490 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance c2fbcd94-57ca-4226-ad75-cdf60b578fd5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.491 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.491 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.546 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.563 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.594 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.594 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:57.613 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.614 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:56:57.616 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:56:57 np0005531888 nova_compute[186788]: 2025-11-22 07:56:57.830 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:59 np0005531888 podman[224252]: 2025-11-22 07:56:59.701127769 +0000 UTC m=+0.072370380 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:57:01 np0005531888 nova_compute[186788]: 2025-11-22 07:57:01.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:02 np0005531888 nova_compute[186788]: 2025-11-22 07:57:02.833 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:03 np0005531888 podman[224276]: 2025-11-22 07:57:03.688404282 +0000 UTC m=+0.066209669 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Nov 22 02:57:06 np0005531888 nova_compute[186788]: 2025-11-22 07:57:06.436 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531888 nova_compute[186788]: 2025-11-22 07:57:07.158 186792 DEBUG nova.virt.libvirt.driver [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:57:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:07.619 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:07 np0005531888 nova_compute[186788]: 2025-11-22 07:57:07.836 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531888 kernel: tap880dadfb-68 (unregistering): left promiscuous mode
Nov 22 02:57:09 np0005531888 NetworkManager[55166]: <info>  [1763798229.5523] device (tap880dadfb-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:57:09 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:09Z|00190|binding|INFO|Releasing lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 from this chassis (sb_readonly=0)
Nov 22 02:57:09 np0005531888 nova_compute[186788]: 2025-11-22 07:57:09.563 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:09Z|00191|binding|INFO|Setting lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 down in Southbound
Nov 22 02:57:09 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:09Z|00192|binding|INFO|Removing iface tap880dadfb-68 ovn-installed in OVS
Nov 22 02:57:09 np0005531888 nova_compute[186788]: 2025-11-22 07:57:09.568 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531888 nova_compute[186788]: 2025-11-22 07:57:09.583 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531888 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Nov 22 02:57:09 np0005531888 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004b.scope: Consumed 14.569s CPU time.
Nov 22 02:57:09 np0005531888 systemd-machined[153106]: Machine qemu-34-instance-0000004b terminated.
Nov 22 02:57:09 np0005531888 nova_compute[186788]: 2025-11-22 07:57:09.789 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531888 nova_compute[186788]: 2025-11-22 07:57:09.793 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531888 podman[224320]: 2025-11-22 07:57:09.947010882 +0000 UTC m=+0.094138717 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:57:09 np0005531888 podman[224322]: 2025-11-22 07:57:09.954319502 +0000 UTC m=+0.100991826 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.017 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c5:66 10.100.0.12'], port_security=['fa:16:3e:b0:c5:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c2fbcd94-57ca-4226-ad75-cdf60b578fd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80d022ea-fcc6-47bf-8d54-551da59f082d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=880dadfb-6870-48ad-9c4e-f8cb0370d421) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.018 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 880dadfb-6870-48ad-9c4e-f8cb0370d421 in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e unbound from our chassis#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.020 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2b438ab-8fa8-4627-8c04-99bed701c19e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.022 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b316bb2a-4c72-460a-8223-a18283c7e123]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.023 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e namespace which is not needed anymore#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.178 186792 INFO nova.virt.libvirt.driver [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 02:57:10 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [NOTICE]   (224153) : haproxy version is 2.8.14-c23fe91
Nov 22 02:57:10 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [NOTICE]   (224153) : path to executable is /usr/sbin/haproxy
Nov 22 02:57:10 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [WARNING]  (224153) : Exiting Master process...
Nov 22 02:57:10 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [ALERT]    (224153) : Current worker (224155) exited with code 143 (Terminated)
Nov 22 02:57:10 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224149]: [WARNING]  (224153) : All workers exited. Exiting... (0)
Nov 22 02:57:10 np0005531888 systemd[1]: libpod-53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196.scope: Deactivated successfully.
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.191 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance destroyed successfully.#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.192 186792 DEBUG nova.objects.instance [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:10 np0005531888 podman[224378]: 2025-11-22 07:57:10.195023013 +0000 UTC m=+0.066724071 container died 53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.204 186792 DEBUG nova.compute.manager [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:10 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196-userdata-shm.mount: Deactivated successfully.
Nov 22 02:57:10 np0005531888 systemd[1]: var-lib-containers-storage-overlay-4a88bcd75ebb2857853161b229d9b35192deeb968941b5647631a53a417f9e44-merged.mount: Deactivated successfully.
Nov 22 02:57:10 np0005531888 podman[224378]: 2025-11-22 07:57:10.311362755 +0000 UTC m=+0.183063803 container cleanup 53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:10 np0005531888 systemd[1]: libpod-conmon-53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196.scope: Deactivated successfully.
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.422 186792 DEBUG oslo_concurrency.lockutils [None req-5fb4d6d0-207f-416d-9a7a-c0b1ef0a12e9 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:10 np0005531888 podman[224409]: 2025-11-22 07:57:10.430546558 +0000 UTC m=+0.097487959 container remove 53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.439 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b168a6-9c96-4271-8bca-db848f1b6e93]: (4, ('Sat Nov 22 07:57:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e (53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196)\n53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196\nSat Nov 22 07:57:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e (53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196)\n53849c8ccc28338e28be846b674f7c311337b3b4ec218e297e17de9d4c62a196\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.442 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[57e108c6-b433-4a52-ac28-b63420fcf8ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.444 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:10 np0005531888 kernel: tapa2b438ab-80: left promiscuous mode
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.447 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.466 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.470 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4380df-5575-4f9d-8b64-c44a554d3eba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.487 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1f4ae4-367d-4b3b-bc48-834b84e6e986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.489 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[97714534-1968-431f-809b-cf4bbee67f81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.511 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1846876e-50d2-48f4-8149-961d8d85fa3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494320, 'reachable_time': 31698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224428, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.515 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:57:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:10.516 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f27723-6bec-4e6c-9ad2-814132cdc44c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531888 systemd[1]: run-netns-ovnmeta\x2da2b438ab\x2d8fa8\x2d4627\x2d8c04\x2d99bed701c19e.mount: Deactivated successfully.
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.840 186792 DEBUG nova.compute.manager [req-bb1743de-ad0c-4df1-a497-b321a7c5c7ee req-b67389c8-eef5-406e-8581-4dd4fcea77b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-unplugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.841 186792 DEBUG oslo_concurrency.lockutils [req-bb1743de-ad0c-4df1-a497-b321a7c5c7ee req-b67389c8-eef5-406e-8581-4dd4fcea77b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.841 186792 DEBUG oslo_concurrency.lockutils [req-bb1743de-ad0c-4df1-a497-b321a7c5c7ee req-b67389c8-eef5-406e-8581-4dd4fcea77b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.841 186792 DEBUG oslo_concurrency.lockutils [req-bb1743de-ad0c-4df1-a497-b321a7c5c7ee req-b67389c8-eef5-406e-8581-4dd4fcea77b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.842 186792 DEBUG nova.compute.manager [req-bb1743de-ad0c-4df1-a497-b321a7c5c7ee req-b67389c8-eef5-406e-8581-4dd4fcea77b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] No waiting events found dispatching network-vif-unplugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:10 np0005531888 nova_compute[186788]: 2025-11-22 07:57:10.842 186792 WARNING nova.compute.manager [req-bb1743de-ad0c-4df1-a497-b321a7c5c7ee req-b67389c8-eef5-406e-8581-4dd4fcea77b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received unexpected event network-vif-unplugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 for instance with vm_state stopped and task_state None.#033[00m
Nov 22 02:57:11 np0005531888 nova_compute[186788]: 2025-11-22 07:57:11.440 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:12 np0005531888 nova_compute[186788]: 2025-11-22 07:57:12.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:12 np0005531888 nova_compute[186788]: 2025-11-22 07:57:12.838 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:13 np0005531888 nova_compute[186788]: 2025-11-22 07:57:13.058 186792 DEBUG nova.compute.manager [req-481636f8-9fcc-4440-ae36-a2066a5d8fd3 req-b5e38a8b-3131-47dd-a3cc-e127b83877bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:13 np0005531888 nova_compute[186788]: 2025-11-22 07:57:13.059 186792 DEBUG oslo_concurrency.lockutils [req-481636f8-9fcc-4440-ae36-a2066a5d8fd3 req-b5e38a8b-3131-47dd-a3cc-e127b83877bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:13 np0005531888 nova_compute[186788]: 2025-11-22 07:57:13.059 186792 DEBUG oslo_concurrency.lockutils [req-481636f8-9fcc-4440-ae36-a2066a5d8fd3 req-b5e38a8b-3131-47dd-a3cc-e127b83877bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:13 np0005531888 nova_compute[186788]: 2025-11-22 07:57:13.059 186792 DEBUG oslo_concurrency.lockutils [req-481636f8-9fcc-4440-ae36-a2066a5d8fd3 req-b5e38a8b-3131-47dd-a3cc-e127b83877bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:13 np0005531888 nova_compute[186788]: 2025-11-22 07:57:13.060 186792 DEBUG nova.compute.manager [req-481636f8-9fcc-4440-ae36-a2066a5d8fd3 req-b5e38a8b-3131-47dd-a3cc-e127b83877bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] No waiting events found dispatching network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:13 np0005531888 nova_compute[186788]: 2025-11-22 07:57:13.060 186792 WARNING nova.compute.manager [req-481636f8-9fcc-4440-ae36-a2066a5d8fd3 req-b5e38a8b-3131-47dd-a3cc-e127b83877bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received unexpected event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 for instance with vm_state stopped and task_state None.#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.159 186792 INFO nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Rebuilding instance#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.732 186792 DEBUG nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.856 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'pci_requests' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.867 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.880 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'resources' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.891 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'migration_context' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.901 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.904 186792 INFO nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance already shutdown.#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.910 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance destroyed successfully.#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.914 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance destroyed successfully.#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.915 186792 DEBUG nova.virt.libvirt.vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-900662590',display_name='tempest-tempest.common.compute-instance-900662590',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-900662590',id=75,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-guig1ujw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTestOtherA-1599563713-project-mem
ber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:13Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=c2fbcd94-57ca-4226-ad75-cdf60b578fd5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.915 186792 DEBUG nova.network.os_vif_util [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.916 186792 DEBUG nova.network.os_vif_util [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.916 186792 DEBUG os_vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.918 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.918 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap880dadfb-68, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.919 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.921 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.923 186792 INFO os_vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68')#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.923 186792 INFO nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Deleting instance files /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5_del#033[00m
Nov 22 02:57:15 np0005531888 nova_compute[186788]: 2025-11-22 07:57:15.924 186792 INFO nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Deletion of /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5_del complete#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.227 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.228 186792 INFO nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Creating image(s)#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.229 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.230 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.231 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.245 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.328 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.330 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.330 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.343 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.408 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.409 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.504 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk 1073741824" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.506 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.507 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.568 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.570 186792 DEBUG nova.virt.disk.api [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Checking if we can resize image /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.570 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.636 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.637 186792 DEBUG nova.virt.disk.api [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Cannot resize image /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.638 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.638 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Ensure instance console log exists: /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.639 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.639 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.640 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.642 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Start _get_guest_xml network_info=[{"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.649 186792 WARNING nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.658 186792 DEBUG nova.virt.libvirt.host [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.660 186792 DEBUG nova.virt.libvirt.host [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.665 186792 DEBUG nova.virt.libvirt.host [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.667 186792 DEBUG nova.virt.libvirt.host [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.669 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.669 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.671 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.671 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.672 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.672 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.672 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.673 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.673 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.673 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.673 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.673 186792 DEBUG nova.virt.hardware [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.674 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.691 186792 DEBUG nova.virt.libvirt.vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:56:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-900662590',display_name='tempest-tempest.common.compute-instance-900662590',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-900662590',id=75,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-guig1ujw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-
ServerActionsTestOtherA-1599563713-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:16Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=c2fbcd94-57ca-4226-ad75-cdf60b578fd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.691 186792 DEBUG nova.network.os_vif_util [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.692 186792 DEBUG nova.network.os_vif_util [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.694 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <uuid>c2fbcd94-57ca-4226-ad75-cdf60b578fd5</uuid>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <name>instance-0000004b</name>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:name>tempest-tempest.common.compute-instance-900662590</nova:name>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:57:16</nova:creationTime>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:user uuid="5a5a623606e647c183360572aab20b70">tempest-ServerActionsTestOtherA-1599563713-project-member</nova:user>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:project uuid="af3a536766704caaad94e5da2e3b88e2">tempest-ServerActionsTestOtherA-1599563713</nova:project>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        <nova:port uuid="880dadfb-6870-48ad-9c4e-f8cb0370d421">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <entry name="serial">c2fbcd94-57ca-4226-ad75-cdf60b578fd5</entry>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <entry name="uuid">c2fbcd94-57ca-4226-ad75-cdf60b578fd5</entry>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:b0:c5:66"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <target dev="tap880dadfb-68"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/console.log" append="off"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:57:16 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:57:16 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:57:16 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:57:16 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.696 186792 DEBUG nova.virt.libvirt.vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:56:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-900662590',display_name='tempest-tempest.common.compute-instance-900662590',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-900662590',id=75,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-guig1ujw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-
ServerActionsTestOtherA-1599563713-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:16Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=c2fbcd94-57ca-4226-ad75-cdf60b578fd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.696 186792 DEBUG nova.network.os_vif_util [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.697 186792 DEBUG nova.network.os_vif_util [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.697 186792 DEBUG os_vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.698 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.698 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.699 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.701 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.702 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap880dadfb-68, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.702 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap880dadfb-68, col_values=(('external_ids', {'iface-id': '880dadfb-6870-48ad-9c4e-f8cb0370d421', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:c5:66', 'vm-uuid': 'c2fbcd94-57ca-4226-ad75-cdf60b578fd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:16 np0005531888 NetworkManager[55166]: <info>  [1763798236.7050] manager: (tap880dadfb-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.706 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.710 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.713 186792 INFO os_vif [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68')#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.782 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.783 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.783 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No VIF found with MAC fa:16:3e:b0:c5:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.784 186792 INFO nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Using config drive#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.798 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:16 np0005531888 nova_compute[186788]: 2025-11-22 07:57:16.837 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'keypairs' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:17 np0005531888 nova_compute[186788]: 2025-11-22 07:57:17.840 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:18 np0005531888 podman[224447]: 2025-11-22 07:57:18.704783736 +0000 UTC m=+0.067109512 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.085 186792 INFO nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Creating config drive at /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.090 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7giuc13_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.219 186792 DEBUG oslo_concurrency.processutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7giuc13_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:19 np0005531888 kernel: tap880dadfb-68: entered promiscuous mode
Nov 22 02:57:19 np0005531888 NetworkManager[55166]: <info>  [1763798239.3205] manager: (tap880dadfb-68): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Nov 22 02:57:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:19Z|00193|binding|INFO|Claiming lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 for this chassis.
Nov 22 02:57:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:19Z|00194|binding|INFO|880dadfb-6870-48ad-9c4e-f8cb0370d421: Claiming fa:16:3e:b0:c5:66 10.100.0.12
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.322 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:19Z|00195|binding|INFO|Setting lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 ovn-installed in OVS
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.338 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:19Z|00196|binding|INFO|Setting lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 up in Southbound
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.344 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c5:66 10.100.0.12'], port_security=['fa:16:3e:b0:c5:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c2fbcd94-57ca-4226-ad75-cdf60b578fd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '80d022ea-fcc6-47bf-8d54-551da59f082d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=880dadfb-6870-48ad-9c4e-f8cb0370d421) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.345 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.346 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 880dadfb-6870-48ad-9c4e-f8cb0370d421 in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e bound to our chassis#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.348 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2b438ab-8fa8-4627-8c04-99bed701c19e#033[00m
Nov 22 02:57:19 np0005531888 systemd-udevd[224486]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.362 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5202a9-af11-4b4b-b04f-08eca6798565]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.364 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2b438ab-81 in ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.367 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2b438ab-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.367 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[94022574-0456-4732-9fec-72528e8fe0d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.369 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d17b92-43d4-4b98-b0ad-1dc60a7a91b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 systemd-machined[153106]: New machine qemu-35-instance-0000004b.
Nov 22 02:57:19 np0005531888 NetworkManager[55166]: <info>  [1763798239.3787] device (tap880dadfb-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:57:19 np0005531888 NetworkManager[55166]: <info>  [1763798239.3807] device (tap880dadfb-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:57:19 np0005531888 systemd[1]: Started Virtual Machine qemu-35-instance-0000004b.
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.388 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0f457ab4-3493-4922-9921-6acb7d7c366a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.409 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f7286746-f355-440c-8d8c-6ed3a3ff5e3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.447 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[352e4dd6-94fc-4dc0-a4f0-34bcd19d524f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.453 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8de71c-7ee0-4130-a659-e5c7927d5578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 NetworkManager[55166]: <info>  [1763798239.4547] manager: (tapa2b438ab-80): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.487 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f21bc447-7759-4c35-acf8-13d631c7a16d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.490 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[faaa6c87-17a1-4a91-a2a5-dad373d3d2c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 NetworkManager[55166]: <info>  [1763798239.5121] device (tapa2b438ab-80): carrier: link connected
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.516 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[93150f16-d729-40b4-8da7-c53bde912e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.533 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f18050b3-f622-4e99-87b0-51df4041f9f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498015, 'reachable_time': 28758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224520, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.548 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[95585aee-a4ed-4df2-b6fe-2b05d4e1c415]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:f49d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498015, 'tstamp': 498015}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224521, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.565 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e1e4c0-aef2-4242-8fd9-6d94904eea4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498015, 'reachable_time': 28758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224522, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.599 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7c957f37-ca25-4bef-a2f8-e6754c186bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.670 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[57d8538f-6b9c-475c-9a47-451023e9521e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.673 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.673 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.674 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b438ab-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:19 np0005531888 kernel: tapa2b438ab-80: entered promiscuous mode
Nov 22 02:57:19 np0005531888 NetworkManager[55166]: <info>  [1763798239.6781] manager: (tapa2b438ab-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.676 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.680 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.682 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2b438ab-80, col_values=(('external_ids', {'iface-id': '1f7bc015-fb2f-41a5-82bb-16526b7a95f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:19Z|00197|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.685 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.686 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.687 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.688 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[917972b3-6cde-4dd0-bc1e-3a9b355bc785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.689 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:57:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:19.690 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'env', 'PROCESS_TAG=haproxy-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2b438ab-8fa8-4627-8c04-99bed701c19e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.763 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for c2fbcd94-57ca-4226-ad75-cdf60b578fd5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.763 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798239.7621508, c2fbcd94-57ca-4226-ad75-cdf60b578fd5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.764 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.768 186792 DEBUG nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.769 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.776 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance spawned successfully.#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.776 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.824 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.834 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.839 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.840 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.840 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.841 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.841 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.841 186792 DEBUG nova.virt.libvirt.driver [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.880 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.881 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798239.7689517, c2fbcd94-57ca-4226-ad75-cdf60b578fd5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.881 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] VM Started (Lifecycle Event)#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.949 186792 DEBUG nova.compute.manager [req-1d0f7ffe-810d-4063-981b-38c3005fc679 req-8ad1d21b-32b9-4eff-a73b-36d5d2a26e69 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.949 186792 DEBUG oslo_concurrency.lockutils [req-1d0f7ffe-810d-4063-981b-38c3005fc679 req-8ad1d21b-32b9-4eff-a73b-36d5d2a26e69 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.950 186792 DEBUG oslo_concurrency.lockutils [req-1d0f7ffe-810d-4063-981b-38c3005fc679 req-8ad1d21b-32b9-4eff-a73b-36d5d2a26e69 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.950 186792 DEBUG oslo_concurrency.lockutils [req-1d0f7ffe-810d-4063-981b-38c3005fc679 req-8ad1d21b-32b9-4eff-a73b-36d5d2a26e69 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.950 186792 DEBUG nova.compute.manager [req-1d0f7ffe-810d-4063-981b-38c3005fc679 req-8ad1d21b-32b9-4eff-a73b-36d5d2a26e69 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] No waiting events found dispatching network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.950 186792 WARNING nova.compute.manager [req-1d0f7ffe-810d-4063-981b-38c3005fc679 req-8ad1d21b-32b9-4eff-a73b-36d5d2a26e69 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received unexpected event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 for instance with vm_state stopped and task_state rebuild_spawning.#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.952 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:19 np0005531888 nova_compute[186788]: 2025-11-22 07:57:19.956 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.186 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:57:20 np0005531888 podman[224559]: 2025-11-22 07:57:20.100650437 +0000 UTC m=+0.027560849 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.227 186792 DEBUG nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:20 np0005531888 podman[224559]: 2025-11-22 07:57:20.32801124 +0000 UTC m=+0.254921632 container create 50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.408 186792 INFO nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] bringing vm to original state: 'stopped'#033[00m
Nov 22 02:57:20 np0005531888 systemd[1]: Started libpod-conmon-50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0.scope.
Nov 22 02:57:20 np0005531888 podman[224572]: 2025-11-22 07:57:20.487799162 +0000 UTC m=+0.110306035 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 02:57:20 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:57:20 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e95d1894feb5ecf38ba5d55c0ac6f20d9ce4ea6f5f8a829ebf835ba1da11b29c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:57:20 np0005531888 podman[224559]: 2025-11-22 07:57:20.581417444 +0000 UTC m=+0.508327866 container init 50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:57:20 np0005531888 podman[224559]: 2025-11-22 07:57:20.588590922 +0000 UTC m=+0.515501314 container start 50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.607 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.607 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.608 186792 DEBUG nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [NOTICE]   (224597) : New worker (224599) forked
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [NOTICE]   (224597) : Loading success.
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.613 186792 DEBUG nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 02:57:20 np0005531888 kernel: tap880dadfb-68 (unregistering): left promiscuous mode
Nov 22 02:57:20 np0005531888 NetworkManager[55166]: <info>  [1763798240.6586] device (tap880dadfb-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:57:20 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:20Z|00198|binding|INFO|Releasing lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 from this chassis (sb_readonly=0)
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.667 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:20 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:20Z|00199|binding|INFO|Setting lport 880dadfb-6870-48ad-9c4e-f8cb0370d421 down in Southbound
Nov 22 02:57:20 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:20Z|00200|binding|INFO|Removing iface tap880dadfb-68 ovn-installed in OVS
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.678 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.689 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:20.711 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c5:66 10.100.0.12'], port_security=['fa:16:3e:b0:c5:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c2fbcd94-57ca-4226-ad75-cdf60b578fd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '80d022ea-fcc6-47bf-8d54-551da59f082d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=880dadfb-6870-48ad-9c4e-f8cb0370d421) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:20.713 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 880dadfb-6870-48ad-9c4e-f8cb0370d421 in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e unbound from our chassis#033[00m
Nov 22 02:57:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:20.715 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2b438ab-8fa8-4627-8c04-99bed701c19e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:57:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:20.716 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b028c3-003b-4973-bbe3-c6377b9284d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:20.717 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e namespace which is not needed anymore#033[00m
Nov 22 02:57:20 np0005531888 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Nov 22 02:57:20 np0005531888 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Consumed 1.159s CPU time.
Nov 22 02:57:20 np0005531888 systemd-machined[153106]: Machine qemu-35-instance-0000004b terminated.
Nov 22 02:57:20 np0005531888 NetworkManager[55166]: <info>  [1763798240.8542] manager: (tap880dadfb-68): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.861 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.907 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance destroyed successfully.#033[00m
Nov 22 02:57:20 np0005531888 nova_compute[186788]: 2025-11-22 07:57:20.907 186792 DEBUG nova.compute.manager [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [NOTICE]   (224597) : haproxy version is 2.8.14-c23fe91
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [NOTICE]   (224597) : path to executable is /usr/sbin/haproxy
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [WARNING]  (224597) : Exiting Master process...
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [ALERT]    (224597) : Current worker (224599) exited with code 143 (Terminated)
Nov 22 02:57:20 np0005531888 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224592]: [WARNING]  (224597) : All workers exited. Exiting... (0)
Nov 22 02:57:21 np0005531888 systemd[1]: libpod-50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0.scope: Deactivated successfully.
Nov 22 02:57:21 np0005531888 podman[224628]: 2025-11-22 07:57:21.00967556 +0000 UTC m=+0.186405547 container died 50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 02:57:21 np0005531888 nova_compute[186788]: 2025-11-22 07:57:21.049 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0-userdata-shm.mount: Deactivated successfully.
Nov 22 02:57:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e95d1894feb5ecf38ba5d55c0ac6f20d9ce4ea6f5f8a829ebf835ba1da11b29c-merged.mount: Deactivated successfully.
Nov 22 02:57:21 np0005531888 nova_compute[186788]: 2025-11-22 07:57:21.704 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:21 np0005531888 nova_compute[186788]: 2025-11-22 07:57:21.710 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:21 np0005531888 nova_compute[186788]: 2025-11-22 07:57:21.711 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:21 np0005531888 nova_compute[186788]: 2025-11-22 07:57:21.711 186792 DEBUG nova.objects.instance [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:57:21 np0005531888 podman[224628]: 2025-11-22 07:57:21.939970317 +0000 UTC m=+1.116700284 container cleanup 50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:57:21 np0005531888 systemd[1]: libpod-conmon-50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0.scope: Deactivated successfully.
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.277 186792 DEBUG nova.compute.manager [req-0916071b-72af-4e57-aae4-bfeb42fa86a1 req-64a044cf-705e-482b-9ecd-3e525495d7d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.278 186792 DEBUG oslo_concurrency.lockutils [req-0916071b-72af-4e57-aae4-bfeb42fa86a1 req-64a044cf-705e-482b-9ecd-3e525495d7d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.279 186792 DEBUG oslo_concurrency.lockutils [req-0916071b-72af-4e57-aae4-bfeb42fa86a1 req-64a044cf-705e-482b-9ecd-3e525495d7d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.280 186792 DEBUG oslo_concurrency.lockutils [req-0916071b-72af-4e57-aae4-bfeb42fa86a1 req-64a044cf-705e-482b-9ecd-3e525495d7d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.280 186792 DEBUG nova.compute.manager [req-0916071b-72af-4e57-aae4-bfeb42fa86a1 req-64a044cf-705e-482b-9ecd-3e525495d7d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] No waiting events found dispatching network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.281 186792 WARNING nova.compute.manager [req-0916071b-72af-4e57-aae4-bfeb42fa86a1 req-64a044cf-705e-482b-9ecd-3e525495d7d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received unexpected event network-vif-plugged-880dadfb-6870-48ad-9c4e-f8cb0370d421 for instance with vm_state stopped and task_state None.#033[00m
Nov 22 02:57:22 np0005531888 podman[224669]: 2025-11-22 07:57:22.353245105 +0000 UTC m=+0.383532597 container remove 50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.361 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4db50964-759d-4c7b-90b0-cdac9d5e5062]: (4, ('Sat Nov 22 07:57:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e (50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0)\n50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0\nSat Nov 22 07:57:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e (50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0)\n50fba3e38cae5ca94c09faa66d2fd7219e6c486f245b5f494ba4e5efd976f8c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.365 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f74bd3-2d23-4496-bf11-41d4837fb13a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.367 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.370 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:22 np0005531888 kernel: tapa2b438ab-80: left promiscuous mode
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.396 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3145c56-e227-4696-bc01-7d8a48b64ee5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.416 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[35b44474-aa54-4548-88c6-92a939424224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.418 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4f80b4bc-e292-4f53-86d8-a9b18940ff22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.440 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[90b24d9e-448d-4589-8ed5-d53540740649]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498008, 'reachable_time': 40079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224687, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.444 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:57:22 np0005531888 systemd[1]: run-netns-ovnmeta\x2da2b438ab\x2d8fa8\x2d4627\x2d8c04\x2d99bed701c19e.mount: Deactivated successfully.
Nov 22 02:57:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:22.444 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[2b421b67-a609-4625-8b56-3c94d8c82b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.818 186792 DEBUG oslo_concurrency.lockutils [None req-f21fa177-0f64-4401-b14f-0e39dc7f1b53 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 1.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:22 np0005531888 nova_compute[186788]: 2025-11-22 07:57:22.844 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:26 np0005531888 nova_compute[186788]: 2025-11-22 07:57:26.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.378 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:27 np0005531888 podman[224692]: 2025-11-22 07:57:27.705119398 +0000 UTC m=+0.070349111 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.846 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.961 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.962 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.962 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.963 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.963 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.974 186792 INFO nova.compute.manager [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Terminating instance#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.983 186792 DEBUG nova.compute.manager [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.992 186792 INFO nova.virt.libvirt.driver [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Instance destroyed successfully.#033[00m
Nov 22 02:57:27 np0005531888 nova_compute[186788]: 2025-11-22 07:57:27.992 186792 DEBUG nova.objects.instance [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'resources' on Instance uuid c2fbcd94-57ca-4226-ad75-cdf60b578fd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.009 186792 DEBUG nova.virt.libvirt.vif [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:56:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-900662590',display_name='tempest-tempest.common.compute-instance-900662590',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-900662590',id=75,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-guig1ujw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTestOtherA-1599563713-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:57:22Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=c2fbcd94-57ca-4226-ad75-cdf60b578fd5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.010 186792 DEBUG nova.network.os_vif_util [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "address": "fa:16:3e:b0:c5:66", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap880dadfb-68", "ovs_interfaceid": "880dadfb-6870-48ad-9c4e-f8cb0370d421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.011 186792 DEBUG nova.network.os_vif_util [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.011 186792 DEBUG os_vif [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.013 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.014 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap880dadfb-68, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.016 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.018 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.022 186792 INFO os_vif [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c5:66,bridge_name='br-int',has_traffic_filtering=True,id=880dadfb-6870-48ad-9c4e-f8cb0370d421,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap880dadfb-68')#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.022 186792 INFO nova.virt.libvirt.driver [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Deleting instance files /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5_del#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.023 186792 INFO nova.virt.libvirt.driver [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Deletion of /var/lib/nova/instances/c2fbcd94-57ca-4226-ad75-cdf60b578fd5_del complete#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.104 186792 INFO nova.compute.manager [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Took 0.12 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.105 186792 DEBUG oslo.service.loopingcall [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.105 186792 DEBUG nova.compute.manager [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:57:28 np0005531888 nova_compute[186788]: 2025-11-22 07:57:28.105 186792 DEBUG nova.network.neutron [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:57:29 np0005531888 nova_compute[186788]: 2025-11-22 07:57:29.953 186792 DEBUG nova.network.neutron [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.025 186792 DEBUG nova.compute.manager [req-8cef00ec-e293-4d97-a6d7-06c5ead8ba26 req-a5e04345-5808-4d58-9812-e3af1e48d706 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Received event network-vif-deleted-880dadfb-6870-48ad-9c4e-f8cb0370d421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.026 186792 INFO nova.compute.manager [req-8cef00ec-e293-4d97-a6d7-06c5ead8ba26 req-a5e04345-5808-4d58-9812-e3af1e48d706 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Neutron deleted interface 880dadfb-6870-48ad-9c4e-f8cb0370d421; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.026 186792 DEBUG nova.network.neutron [req-8cef00ec-e293-4d97-a6d7-06c5ead8ba26 req-a5e04345-5808-4d58-9812-e3af1e48d706 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.040 186792 INFO nova.compute.manager [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Took 1.93 seconds to deallocate network for instance.#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.049 186792 DEBUG nova.compute.manager [req-8cef00ec-e293-4d97-a6d7-06c5ead8ba26 req-a5e04345-5808-4d58-9812-e3af1e48d706 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Detach interface failed, port_id=880dadfb-6870-48ad-9c4e-f8cb0370d421, reason: Instance c2fbcd94-57ca-4226-ad75-cdf60b578fd5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.157 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.158 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.219 186792 DEBUG nova.compute.provider_tree [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.235 186792 DEBUG nova.scheduler.client.report [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.275 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.392 186792 INFO nova.scheduler.client.report [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Deleted allocations for instance c2fbcd94-57ca-4226-ad75-cdf60b578fd5#033[00m
Nov 22 02:57:30 np0005531888 nova_compute[186788]: 2025-11-22 07:57:30.497 186792 DEBUG oslo_concurrency.lockutils [None req-ba342f3b-99de-4731-b1ac-39cb2e2c0a9a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "c2fbcd94-57ca-4226-ad75-cdf60b578fd5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:30 np0005531888 podman[224712]: 2025-11-22 07:57:30.720021209 +0000 UTC m=+0.087897963 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:57:32 np0005531888 nova_compute[186788]: 2025-11-22 07:57:32.848 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:33 np0005531888 nova_compute[186788]: 2025-11-22 07:57:33.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:34 np0005531888 podman[224737]: 2025-11-22 07:57:34.699079999 +0000 UTC m=+0.063433262 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 22 02:57:35 np0005531888 nova_compute[186788]: 2025-11-22 07:57:35.904 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798240.903185, c2fbcd94-57ca-4226-ad75-cdf60b578fd5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:35 np0005531888 nova_compute[186788]: 2025-11-22 07:57:35.905 186792 INFO nova.compute.manager [-] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:57:35 np0005531888 nova_compute[186788]: 2025-11-22 07:57:35.924 186792 DEBUG nova.compute.manager [None req-df6fc3b5-6902-4de8-98e7-175f1c94f7b1 - - - - - -] [instance: c2fbcd94-57ca-4226-ad75-cdf60b578fd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:36.810 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:36.811 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:36.811 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:37 np0005531888 nova_compute[186788]: 2025-11-22 07:57:37.850 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:38 np0005531888 nova_compute[186788]: 2025-11-22 07:57:38.018 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:40 np0005531888 podman[224758]: 2025-11-22 07:57:40.693223384 +0000 UTC m=+0.068411354 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:57:40 np0005531888 podman[224759]: 2025-11-22 07:57:40.713916112 +0000 UTC m=+0.084669893 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:57:42 np0005531888 nova_compute[186788]: 2025-11-22 07:57:42.851 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:43 np0005531888 nova_compute[186788]: 2025-11-22 07:57:43.020 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.393 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.393 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.410 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.555 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.556 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.561 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.561 186792 INFO nova.compute.claims [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.733 186792 DEBUG nova.compute.provider_tree [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:57:44 np0005531888 nova_compute[186788]: 2025-11-22 07:57:44.757 186792 DEBUG nova.scheduler.client.report [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.048 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.049 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.146 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.147 186792 DEBUG nova.network.neutron [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.175 186792 INFO nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.203 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.390 186792 DEBUG nova.policy [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e24c302b62fb470aa189b76d4676733b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063bf16c91af408ca075c690797e09d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.438 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.439 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.439 186792 INFO nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Creating image(s)#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.440 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.440 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.441 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.495 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.557 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.558 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.558 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.569 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.625 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:45 np0005531888 nova_compute[186788]: 2025-11-22 07:57:45.626 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.091 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk 1073741824" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.092 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.093 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.163 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.164 186792 DEBUG nova.virt.disk.api [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.164 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.222 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.224 186792 DEBUG nova.virt.disk.api [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.224 186792 DEBUG nova.objects.instance [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.245 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.246 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Ensure instance console log exists: /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.247 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.247 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.248 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:46 np0005531888 nova_compute[186788]: 2025-11-22 07:57:46.594 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.109 186792 DEBUG nova.network.neutron [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Successfully created port: 7d226e71-e4bc-4841-b81e-fc63ab1cab60 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:57:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:47.114 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:47.115 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.117 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.853 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.949 186792 DEBUG nova.network.neutron [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Successfully updated port: 7d226e71-e4bc-4841-b81e-fc63ab1cab60 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.950 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.967 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.977 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.978 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:47 np0005531888 nova_compute[186788]: 2025-11-22 07:57:47.978 186792 DEBUG nova.network.neutron [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:57:48 np0005531888 nova_compute[186788]: 2025-11-22 07:57:48.022 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:48 np0005531888 nova_compute[186788]: 2025-11-22 07:57:48.035 186792 DEBUG nova.compute.manager [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-changed-7d226e71-e4bc-4841-b81e-fc63ab1cab60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:48 np0005531888 nova_compute[186788]: 2025-11-22 07:57:48.035 186792 DEBUG nova.compute.manager [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Refreshing instance network info cache due to event network-changed-7d226e71-e4bc-4841-b81e-fc63ab1cab60. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:57:48 np0005531888 nova_compute[186788]: 2025-11-22 07:57:48.036 186792 DEBUG oslo_concurrency.lockutils [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:48 np0005531888 nova_compute[186788]: 2025-11-22 07:57:48.130 186792 DEBUG nova.network.neutron [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:57:49 np0005531888 podman[224819]: 2025-11-22 07:57:49.6783744 +0000 UTC m=+0.049147839 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.596 186792 DEBUG nova.network.neutron [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Updating instance_info_cache with network_info: [{"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.672 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.673 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Instance network_info: |[{"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.673 186792 DEBUG oslo_concurrency.lockutils [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.673 186792 DEBUG nova.network.neutron [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Refreshing network info cache for port 7d226e71-e4bc-4841-b81e-fc63ab1cab60 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:57:50 np0005531888 podman[224844]: 2025-11-22 07:57:50.675300646 +0000 UTC m=+0.049334384 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.676 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Start _get_guest_xml network_info=[{"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.680 186792 WARNING nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.685 186792 DEBUG nova.virt.libvirt.host [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.685 186792 DEBUG nova.virt.libvirt.host [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.689 186792 DEBUG nova.virt.libvirt.host [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.689 186792 DEBUG nova.virt.libvirt.host [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.690 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.691 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.691 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.691 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.691 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.692 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.692 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.692 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.692 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.692 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.693 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.693 186792 DEBUG nova.virt.hardware [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.696 186792 DEBUG nova.virt.libvirt.vif [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:57:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-112250648',display_name='tempest-ServerDiskConfigTestJSON-server-112250648',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-112250648',id=78,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qh5vq0is',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskC
onfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:45Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=4aa9cf15-4a61-49f4-ac9e-28ce42abc31d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.697 186792 DEBUG nova.network.os_vif_util [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.697 186792 DEBUG nova.network.os_vif_util [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.698 186792 DEBUG nova.objects.instance [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.712 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <uuid>4aa9cf15-4a61-49f4-ac9e-28ce42abc31d</uuid>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <name>instance-0000004e</name>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-112250648</nova:name>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:57:50</nova:creationTime>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        <nova:port uuid="7d226e71-e4bc-4841-b81e-fc63ab1cab60">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <entry name="serial">4aa9cf15-4a61-49f4-ac9e-28ce42abc31d</entry>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <entry name="uuid">4aa9cf15-4a61-49f4-ac9e-28ce42abc31d</entry>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.config"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:69:b5:76"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <target dev="tap7d226e71-e4"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/console.log" append="off"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:57:50 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:57:50 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:57:50 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:57:50 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.713 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Preparing to wait for external event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.714 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.714 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.715 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.715 186792 DEBUG nova.virt.libvirt.vif [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:57:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-112250648',display_name='tempest-ServerDiskConfigTestJSON-server-112250648',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-112250648',id=78,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qh5vq0is',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:45Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=4aa9cf15-4a61-49f4-ac9e-28ce42abc31d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.716 186792 DEBUG nova.network.os_vif_util [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.716 186792 DEBUG nova.network.os_vif_util [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
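The converted VIFOpenVSwitch object carries the subnet details shown in the VIF JSON above. As a quick sanity check, a minimal sketch using only values taken from this log (not Nova code), the instance's fixed IP and the gateway both fall inside the advertised CIDR:

```python
import ipaddress

# Values copied from the VIF/network JSON logged above.
cidr = ipaddress.ip_network("10.100.0.0/28")
gateway = ipaddress.ip_address("10.100.0.1")
fixed_ip = ipaddress.ip_address("10.100.0.9")

# Both the gateway and the instance's fixed address sit inside the subnet.
assert gateway in cidr and fixed_ip in cidr

# A /28 spans 16 addresses, leaving 14 usable hosts after network/broadcast.
assert cidr.num_addresses == 16
```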
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.717 186792 DEBUG os_vif [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.718 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.718 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.722 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.722 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d226e71-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.722 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7d226e71-e4, col_values=(('external_ids', {'iface-id': '7d226e71-e4bc-4841-b81e-fc63ab1cab60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:b5:76', 'vm-uuid': '4aa9cf15-4a61-49f4-ac9e-28ce42abc31d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.724 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:50 np0005531888 NetworkManager[55166]: <info>  [1763798270.7252] manager: (tap7d226e71-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.729 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.729 186792 INFO os_vif [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4')#033[00m
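The plug sequence above runs three OVSDB commands: AddBridgeCommand (a no-op here, since br-int exists), AddPortCommand, and a DbSetCommand that stamps the Interface row with external_ids. A minimal sketch of that mapping, assembled from the values in the logged transaction; `build_external_ids` is an illustrative helper name, not part of os-vif:

```python
def build_external_ids(port_id, mac, instance_uuid):
    # Keys mirror the DbSetCommand col_values in the transaction above;
    # ovn-controller matches "iface-id" against the logical port name.
    return {
        "iface-id": port_id,        # Neutron port UUID
        "iface-status": "active",
        "attached-mac": mac,        # MAC assigned by Neutron
        "vm-uuid": instance_uuid,   # Nova instance UUID
    }

ids = build_external_ids(
    "7d226e71-e4bc-4841-b81e-fc63ab1cab60",
    "fa:16:3e:69:b5:76",
    "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d",
)
assert ids["iface-id"] == "7d226e71-e4bc-4841-b81e-fc63ab1cab60"
```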
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:50 np0005531888 nova_compute[186788]: 2025-11-22 07:57:50.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:51 np0005531888 nova_compute[186788]: 2025-11-22 07:57:51.452 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:51 np0005531888 nova_compute[186788]: 2025-11-22 07:57:51.453 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:51 np0005531888 nova_compute[186788]: 2025-11-22 07:57:51.453 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:69:b5:76, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:57:51 np0005531888 nova_compute[186788]: 2025-11-22 07:57:51.454 186792 INFO nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Using config drive#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.439 186792 INFO nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Creating config drive at /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.config#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.446 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfa0_rp90 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.570 186792 DEBUG oslo_concurrency.processutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfa0_rp90" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
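oslo.concurrency logs the config-drive command as a space-joined string; the argv it actually executes is a list. A sketch reconstructing that list from the log line above (not Nova's code; the multi-word `-publisher` value is a single argument, which the joined log form obscures):

```python
# Reconstructed from the CMD line logged above.
publisher = "OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9"
iso_path = ("/var/lib/nova/instances/"
            "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk.config")
cmd = [
    "/usr/bin/mkisofs",
    "-o", iso_path,
    "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
    "-publisher", publisher,   # one argv entry despite the spaces
    "-quiet", "-J", "-r",
    "-V", "config-2",          # volume label the guest looks for
    "/tmp/tmpfa0_rp90",        # temp dir holding the metadata tree
]

# Joining with spaces reproduces the string form seen in the log.
assert " ".join(cmd).endswith("-V config-2 /tmp/tmpfa0_rp90")
```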
Nov 22 02:57:52 np0005531888 kernel: tap7d226e71-e4: entered promiscuous mode
Nov 22 02:57:52 np0005531888 NetworkManager[55166]: <info>  [1763798272.6314] manager: (tap7d226e71-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.632 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:52Z|00201|binding|INFO|Claiming lport 7d226e71-e4bc-4841-b81e-fc63ab1cab60 for this chassis.
Nov 22 02:57:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:52Z|00202|binding|INFO|7d226e71-e4bc-4841-b81e-fc63ab1cab60: Claiming fa:16:3e:69:b5:76 10.100.0.9
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.645 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:b5:76 10.100.0.9'], port_security=['fa:16:3e:69:b5:76 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4aa9cf15-4a61-49f4-ac9e-28ce42abc31d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=7d226e71-e4bc-4841-b81e-fc63ab1cab60) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:52Z|00203|binding|INFO|Setting lport 7d226e71-e4bc-4841-b81e-fc63ab1cab60 ovn-installed in OVS
Nov 22 02:57:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:52Z|00204|binding|INFO|Setting lport 7d226e71-e4bc-4841-b81e-fc63ab1cab60 up in Southbound
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.647 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 7d226e71-e4bc-4841-b81e-fc63ab1cab60 in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis#033[00m
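In the Port_Binding row matched above, each entry of the `mac` column is a single space-separated string: MAC first, then the bound IPs ('fa:16:3e:69:b5:76 10.100.0.9'). A small illustrative parser for that format, inferred from the logged row rather than taken from OVN's code:

```python
def parse_binding_addresses(mac_field):
    """Split an OVN Port_Binding 'mac' entry into (mac, [ips]).

    Format as seen in the log: "<mac> <ip> [<ip> ...]".
    """
    mac, *ips = mac_field.split()
    return mac, ips

mac, ips = parse_binding_addresses("fa:16:3e:69:b5:76 10.100.0.9")
assert mac == "fa:16:3e:69:b5:76"
assert ips == ["10.100.0.9"]
```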
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.648 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.648 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.659 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4c27b1-a6e4-480a-8e3d-aedd7493cbbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.660 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.662 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.662 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dcac59a6-bd22-4088-b458-94d625d4f358]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.663 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[688d24a4-de2d-4df1-9899-ace8fc5dbbe6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
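The metadata agent provisions a VETH pair whose names are derived from the network UUID: the namespace takes the full UUID, while the interface names truncate it to fit the kernel's 15-character limit. The slicing below is inferred from the names in this log (tapd54e232a-50 / tapd54e232a-51 in ovnmeta-d54e232a-…), not lifted from Neutron's code:

```python
NET_ID = "d54e232a-5c68-4cc7-b58c-054da9c4646f"

# Namespace: "ovnmeta-" + full network UUID.
namespace = "ovnmeta-" + NET_ID

# Veth pair: "tap" + truncated UUID, with a trailing digit telling the
# OVS-side end ("...0") from the in-namespace end ("...1").
outer = "tap" + NET_ID[:10] + "0"
inner = "tap" + NET_ID[:10] + "1"

assert namespace == "ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f"
assert (outer, inner) == ("tapd54e232a-50", "tapd54e232a-51")
assert len(outer) <= 15  # IFNAMSIZ - 1
```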
Nov 22 02:57:52 np0005531888 systemd-udevd[224885]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:57:52 np0005531888 systemd-machined[153106]: New machine qemu-36-instance-0000004e.
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.674 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[3327613e-6361-4408-8986-7f92df3d7779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 systemd[1]: Started Virtual Machine qemu-36-instance-0000004e.
Nov 22 02:57:52 np0005531888 NetworkManager[55166]: <info>  [1763798272.6801] device (tap7d226e71-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:57:52 np0005531888 NetworkManager[55166]: <info>  [1763798272.6808] device (tap7d226e71-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.687 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[22dc7607-d41f-4c33-9600-7f0676302214]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.713 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[02d0c439-b235-415a-aff3-72818a66b6c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.719 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0841f47f-3c30-4575-b6ce-10e1a2fbd8ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 NetworkManager[55166]: <info>  [1763798272.7202] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/112)
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.755 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4b9734-65b2-4eac-b3ec-8daf11200e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.758 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[76623794-2f16-447d-8162-c3bc27af22e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 NetworkManager[55166]: <info>  [1763798272.7794] device (tapd54e232a-50): carrier: link connected
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.783 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cd792f-d720-444f-9e95-6335f87e89bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.800 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef2720f-3868-42bc-9921-4d41aee23781]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501342, 'reachable_time': 17205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224917, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.816 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[90861eba-f46e-48dc-9e82-2399c60d372a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501342, 'tstamp': 501342}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224918, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.830 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[554ffb9c-b971-49c1-a88a-1ae4a4bbeadb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501342, 'reachable_time': 17205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224919, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.854 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.861 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[86f3f0b5-6c71-47eb-a223-b7a14cd5126f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.916 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[72636386-7d60-42f3-8dea-a821eed5d92f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.918 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.919 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.919 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.921 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 NetworkManager[55166]: <info>  [1763798272.9218] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Nov 22 02:57:52 np0005531888 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.923 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.926 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:52 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:52Z|00205|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.928 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.932 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.933 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[978e6ab0-5bd7-4b58-ac2e-c13dbd015347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.934 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:57:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:52.934 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.992 186792 DEBUG nova.compute.manager [req-bc47f1df-9697-4e3b-8ea6-f37f33b225bd req-f2860951-2b14-4043-9871-f048f2d79a73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.992 186792 DEBUG oslo_concurrency.lockutils [req-bc47f1df-9697-4e3b-8ea6-f37f33b225bd req-f2860951-2b14-4043-9871-f048f2d79a73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.993 186792 DEBUG oslo_concurrency.lockutils [req-bc47f1df-9697-4e3b-8ea6-f37f33b225bd req-f2860951-2b14-4043-9871-f048f2d79a73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.993 186792 DEBUG oslo_concurrency.lockutils [req-bc47f1df-9697-4e3b-8ea6-f37f33b225bd req-f2860951-2b14-4043-9871-f048f2d79a73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:52 np0005531888 nova_compute[186788]: 2025-11-22 07:57:52.993 186792 DEBUG nova.compute.manager [req-bc47f1df-9697-4e3b-8ea6-f37f33b225bd req-f2860951-2b14-4043-9871-f048f2d79a73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Processing event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.101 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.103 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798273.1012301, 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.103 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] VM Started (Lifecycle Event)#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.106 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.110 186792 INFO nova.virt.libvirt.driver [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Instance spawned successfully.#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.110 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.124 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.130 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.133 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.133 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.134 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.134 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.134 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.135 186792 DEBUG nova.virt.libvirt.driver [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.182 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.183 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798273.1029856, 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.183 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.201 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.206 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798273.1060102, 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.206 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.215 186792 DEBUG nova.network.neutron [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Updated VIF entry in instance network info cache for port 7d226e71-e4bc-4841-b81e-fc63ab1cab60. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.215 186792 DEBUG nova.network.neutron [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Updating instance_info_cache with network_info: [{"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.238 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.244 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.247 186792 DEBUG oslo_concurrency.lockutils [req-adf7fbb8-0d7a-4e93-b5d9-88bdc879f33c req-d5107a67-04f0-4677-b7e1-810a9a89d4b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.279 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:57:53 np0005531888 podman[224957]: 2025-11-22 07:57:53.278399006 +0000 UTC m=+0.022420663 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:57:53 np0005531888 podman[224957]: 2025-11-22 07:57:53.376808716 +0000 UTC m=+0.120830353 container create 5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:53 np0005531888 systemd[1]: Started libpod-conmon-5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329.scope.
Nov 22 02:57:53 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:57:53 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8be64e35a5bdb1936fc40e842c51e434da57d00a43728afe22cf70f2219bf3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:57:53 np0005531888 podman[224957]: 2025-11-22 07:57:53.541957599 +0000 UTC m=+0.285979256 container init 5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:57:53 np0005531888 podman[224957]: 2025-11-22 07:57:53.546994374 +0000 UTC m=+0.291016011 container start 5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.566 186792 INFO nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Took 8.13 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.566 186792 DEBUG nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [NOTICE]   (224977) : New worker (224979) forked
Nov 22 02:57:53 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [NOTICE]   (224977) : Loading success.
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.691 186792 INFO nova.compute.manager [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Took 9.18 seconds to build instance.#033[00m
Nov 22 02:57:53 np0005531888 nova_compute[186788]: 2025-11-22 07:57:53.719 186792 DEBUG oslo_concurrency.lockutils [None req-7171b1c0-24ea-4e6f-b4ec-b359ded42afd e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.074 186792 DEBUG nova.compute.manager [req-8e76e51d-f4b7-4a65-b656-3ec299d5e259 req-5fd860e2-4470-4ef8-b56e-13bc434c4601 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.075 186792 DEBUG oslo_concurrency.lockutils [req-8e76e51d-f4b7-4a65-b656-3ec299d5e259 req-5fd860e2-4470-4ef8-b56e-13bc434c4601 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.075 186792 DEBUG oslo_concurrency.lockutils [req-8e76e51d-f4b7-4a65-b656-3ec299d5e259 req-5fd860e2-4470-4ef8-b56e-13bc434c4601 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.075 186792 DEBUG oslo_concurrency.lockutils [req-8e76e51d-f4b7-4a65-b656-3ec299d5e259 req-5fd860e2-4470-4ef8-b56e-13bc434c4601 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.076 186792 DEBUG nova.compute.manager [req-8e76e51d-f4b7-4a65-b656-3ec299d5e259 req-5fd860e2-4470-4ef8-b56e-13bc434c4601 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] No waiting events found dispatching network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.076 186792 WARNING nova.compute.manager [req-8e76e51d-f4b7-4a65-b656-3ec299d5e259 req-5fd860e2-4470-4ef8-b56e-13bc434c4601 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received unexpected event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:57:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:55.117 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:55 np0005531888 nova_compute[186788]: 2025-11-22 07:57:55.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:57:57 np0005531888 nova_compute[186788]: 2025-11-22 07:57:57.858 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:57 np0005531888 nova_compute[186788]: 2025-11-22 07:57:57.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:58 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:58Z|00206|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.536 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:58 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:58Z|00207|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.691 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:58 np0005531888 podman[224988]: 2025-11-22 07:57:58.729146803 +0000 UTC m=+0.065873932 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.897 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.898 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.899 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.899 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.899 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.911 186792 INFO nova.compute.manager [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Terminating instance#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.921 186792 DEBUG nova.compute.manager [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:57:58 np0005531888 kernel: tap7d226e71-e4 (unregistering): left promiscuous mode
Nov 22 02:57:58 np0005531888 NetworkManager[55166]: <info>  [1763798278.9532] device (tap7d226e71-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:58 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:58Z|00208|binding|INFO|Releasing lport 7d226e71-e4bc-4841-b81e-fc63ab1cab60 from this chassis (sb_readonly=0)
Nov 22 02:57:58 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:58Z|00209|binding|INFO|Setting lport 7d226e71-e4bc-4841-b81e-fc63ab1cab60 down in Southbound
Nov 22 02:57:58 np0005531888 ovn_controller[95067]: 2025-11-22T07:57:58Z|00210|binding|INFO|Removing iface tap7d226e71-e4 ovn-installed in OVS
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.961 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:58.966 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:b5:76 10.100.0.9'], port_security=['fa:16:3e:69:b5:76 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4aa9cf15-4a61-49f4-ac9e-28ce42abc31d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=7d226e71-e4bc-4841-b81e-fc63ab1cab60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:58.969 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 7d226e71-e4bc-4841-b81e-fc63ab1cab60 in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:57:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:58.971 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:57:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:58.973 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[abfeff9b-511d-4047-a06f-3476244fdd5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:57:58.974 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.976 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:58 np0005531888 nova_compute[186788]: 2025-11-22 07:57:58.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:57:59 np0005531888 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Nov 22 02:57:59 np0005531888 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004e.scope: Consumed 6.307s CPU time.
Nov 22 02:57:59 np0005531888 systemd-machined[153106]: Machine qemu-36-instance-0000004e terminated.
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.142 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.185 186792 INFO nova.virt.libvirt.driver [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Instance destroyed successfully.#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.186 186792 DEBUG nova.objects.instance [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.196 186792 DEBUG nova.virt.libvirt.vif [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:57:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-112250648',display_name='tempest-ServerDiskConfigTestJSON-server-112250648',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-112250648',id=78,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qh5vq0is',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:57:57Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=4aa9cf15-4a61-49f4-ac9e-28ce42abc31d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.196 186792 DEBUG nova.network.os_vif_util [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "address": "fa:16:3e:69:b5:76", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d226e71-e4", "ovs_interfaceid": "7d226e71-e4bc-4841-b81e-fc63ab1cab60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.197 186792 DEBUG nova.network.os_vif_util [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.197 186792 DEBUG os_vif [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.199 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.200 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d226e71-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.202 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.205 186792 INFO os_vif [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:b5:76,bridge_name='br-int',has_traffic_filtering=True,id=7d226e71-e4bc-4841-b81e-fc63ab1cab60,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d226e71-e4')#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.206 186792 INFO nova.virt.libvirt.driver [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Deleting instance files /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d_del#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.206 186792 INFO nova.virt.libvirt.driver [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Deletion of /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d_del complete#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.288 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000004e, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/4aa9cf15-4a61-49f4-ac9e-28ce42abc31d/disk#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.307 186792 INFO nova.compute.manager [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.307 186792 DEBUG oslo.service.loopingcall [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.307 186792 DEBUG nova.compute.manager [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.307 186792 DEBUG nova.network.neutron [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:57:59 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [NOTICE]   (224977) : haproxy version is 2.8.14-c23fe91
Nov 22 02:57:59 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [NOTICE]   (224977) : path to executable is /usr/sbin/haproxy
Nov 22 02:57:59 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [WARNING]  (224977) : Exiting Master process...
Nov 22 02:57:59 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [WARNING]  (224977) : Exiting Master process...
Nov 22 02:57:59 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [ALERT]    (224977) : Current worker (224979) exited with code 143 (Terminated)
Nov 22 02:57:59 np0005531888 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224973]: [WARNING]  (224977) : All workers exited. Exiting... (0)
Nov 22 02:57:59 np0005531888 systemd[1]: libpod-5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329.scope: Deactivated successfully.
Nov 22 02:57:59 np0005531888 podman[225030]: 2025-11-22 07:57:59.405389449 +0000 UTC m=+0.338871397 container died 5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.472 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.473 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5663MB free_disk=73.34917831420898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.473 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.473 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.476 186792 DEBUG nova.compute.manager [req-30901b48-f903-4f54-9388-983d83ca0f3a req-796eb052-714a-4a19-8573-4bacffd8d862 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-vif-unplugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.476 186792 DEBUG oslo_concurrency.lockutils [req-30901b48-f903-4f54-9388-983d83ca0f3a req-796eb052-714a-4a19-8573-4bacffd8d862 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.476 186792 DEBUG oslo_concurrency.lockutils [req-30901b48-f903-4f54-9388-983d83ca0f3a req-796eb052-714a-4a19-8573-4bacffd8d862 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.476 186792 DEBUG oslo_concurrency.lockutils [req-30901b48-f903-4f54-9388-983d83ca0f3a req-796eb052-714a-4a19-8573-4bacffd8d862 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.476 186792 DEBUG nova.compute.manager [req-30901b48-f903-4f54-9388-983d83ca0f3a req-796eb052-714a-4a19-8573-4bacffd8d862 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] No waiting events found dispatching network-vif-unplugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.477 186792 DEBUG nova.compute.manager [req-30901b48-f903-4f54-9388-983d83ca0f3a req-796eb052-714a-4a19-8573-4bacffd8d862 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-vif-unplugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.558 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.558 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.558 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.615 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.630 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.676 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.677 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.922 186792 DEBUG nova.network.neutron [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:59 np0005531888 nova_compute[186788]: 2025-11-22 07:57:59.938 186792 INFO nova.compute.manager [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Took 0.63 seconds to deallocate network for instance.#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.028 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.029 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.032 186792 DEBUG nova.compute.manager [req-330ff7d3-6221-4549-8a40-cf4ae9a0b0e9 req-7e53c634-e9a6-43ee-8823-98e9783b2e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-vif-deleted-7d226e71-e4bc-4841-b81e-fc63ab1cab60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:00 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329-userdata-shm.mount: Deactivated successfully.
Nov 22 02:58:00 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e8be64e35a5bdb1936fc40e842c51e434da57d00a43728afe22cf70f2219bf3f-merged.mount: Deactivated successfully.
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.079 186792 DEBUG nova.compute.provider_tree [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.095 186792 DEBUG nova.scheduler.client.report [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.122 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.168 186792 INFO nova.scheduler.client.report [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocations for instance 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d#033[00m
Nov 22 02:58:00 np0005531888 podman[225030]: 2025-11-22 07:58:00.225713 +0000 UTC m=+1.159194948 container cleanup 5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:58:00 np0005531888 systemd[1]: libpod-conmon-5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329.scope: Deactivated successfully.
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.266 186792 DEBUG oslo_concurrency.lockutils [None req-2ed5365f-f30a-4a97-a27b-769e88a2120f e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:00 np0005531888 podman[225077]: 2025-11-22 07:58:00.543968791 +0000 UTC m=+0.297351447 container remove 5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.550 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[78f4cf7f-4da0-4192-a51e-4f729f7b632a]: (4, ('Sat Nov 22 07:57:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329)\n5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329\nSat Nov 22 07:58:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329)\n5fc18056505ecfe86b766b2b43b7d03b0399affdb6796bee60acdfc078623329\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.552 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aa98dc61-a376-4247-a905-d6a883313f67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.553 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:00 np0005531888 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.556 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:00 np0005531888 nova_compute[186788]: 2025-11-22 07:58:00.567 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.570 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b5476636-0960-499b-929d-9463dea03391]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.584 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[779a49a1-919f-4149-9d7d-e69fb7bc9d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.585 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8cddcbd9-5f77-4430-8e47-268f61521d4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.602 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b6611b-2377-42d7-b7cf-497b789c33d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501335, 'reachable_time': 32837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225095, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.604 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:58:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:00.605 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[03e27cb0-8beb-46e5-a5b3-c6367143c7cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:00 np0005531888 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 02:58:00 np0005531888 podman[225096]: 2025-11-22 07:58:00.989996474 +0000 UTC m=+0.053965510 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:58:01 np0005531888 nova_compute[186788]: 2025-11-22 07:58:01.669 186792 DEBUG nova.compute.manager [req-e85bcef8-7aec-4b12-addb-45942262a9f0 req-b22e7c20-90f5-4fc2-9843-efa45f41c165 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:01 np0005531888 nova_compute[186788]: 2025-11-22 07:58:01.670 186792 DEBUG oslo_concurrency.lockutils [req-e85bcef8-7aec-4b12-addb-45942262a9f0 req-b22e7c20-90f5-4fc2-9843-efa45f41c165 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:01 np0005531888 nova_compute[186788]: 2025-11-22 07:58:01.670 186792 DEBUG oslo_concurrency.lockutils [req-e85bcef8-7aec-4b12-addb-45942262a9f0 req-b22e7c20-90f5-4fc2-9843-efa45f41c165 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:01 np0005531888 nova_compute[186788]: 2025-11-22 07:58:01.670 186792 DEBUG oslo_concurrency.lockutils [req-e85bcef8-7aec-4b12-addb-45942262a9f0 req-b22e7c20-90f5-4fc2-9843-efa45f41c165 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4aa9cf15-4a61-49f4-ac9e-28ce42abc31d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:01 np0005531888 nova_compute[186788]: 2025-11-22 07:58:01.671 186792 DEBUG nova.compute.manager [req-e85bcef8-7aec-4b12-addb-45942262a9f0 req-b22e7c20-90f5-4fc2-9843-efa45f41c165 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] No waiting events found dispatching network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:01 np0005531888 nova_compute[186788]: 2025-11-22 07:58:01.671 186792 WARNING nova.compute.manager [req-e85bcef8-7aec-4b12-addb-45942262a9f0 req-b22e7c20-90f5-4fc2-9843-efa45f41c165 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Received unexpected event network-vif-plugged-7d226e71-e4bc-4841-b81e-fc63ab1cab60 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:58:02 np0005531888 nova_compute[186788]: 2025-11-22 07:58:02.859 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:04 np0005531888 nova_compute[186788]: 2025-11-22 07:58:04.203 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:05 np0005531888 podman[225119]: 2025-11-22 07:58:05.688024461 +0000 UTC m=+0.063053352 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal)
Nov 22 02:58:07 np0005531888 nova_compute[186788]: 2025-11-22 07:58:07.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:07 np0005531888 nova_compute[186788]: 2025-11-22 07:58:07.861 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:09 np0005531888 nova_compute[186788]: 2025-11-22 07:58:09.207 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:11 np0005531888 podman[225141]: 2025-11-22 07:58:11.682555996 +0000 UTC m=+0.055915557 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true)
Nov 22 02:58:11 np0005531888 podman[225142]: 2025-11-22 07:58:11.721899253 +0000 UTC m=+0.092207639 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 02:58:12 np0005531888 nova_compute[186788]: 2025-11-22 07:58:12.862 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:14 np0005531888 nova_compute[186788]: 2025-11-22 07:58:14.184 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798279.1831915, 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:14 np0005531888 nova_compute[186788]: 2025-11-22 07:58:14.184 186792 INFO nova.compute.manager [-] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:58:14 np0005531888 nova_compute[186788]: 2025-11-22 07:58:14.210 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:14 np0005531888 nova_compute[186788]: 2025-11-22 07:58:14.216 186792 DEBUG nova.compute.manager [None req-c9d02f9e-d939-4b18-8d8b-6323cda37174 - - - - - -] [instance: 4aa9cf15-4a61-49f4-ac9e-28ce42abc31d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:17 np0005531888 nova_compute[186788]: 2025-11-22 07:58:17.863 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:18 np0005531888 nova_compute[186788]: 2025-11-22 07:58:18.859 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:18 np0005531888 nova_compute[186788]: 2025-11-22 07:58:18.860 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:18 np0005531888 nova_compute[186788]: 2025-11-22 07:58:18.998 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.094 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.094 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.102 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.103 186792 INFO nova.compute.claims [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.214 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.256 186792 DEBUG nova.compute.provider_tree [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.269 186792 DEBUG nova.scheduler.client.report [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.303 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.304 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.371 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.372 186792 DEBUG nova.network.neutron [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.459 186792 INFO nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.475 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.660 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.664 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.665 186792 INFO nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Creating image(s)#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.666 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "/var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.666 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "/var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.667 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "/var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.680 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.734 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.736 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.736 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.747 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.803 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:19 np0005531888 nova_compute[186788]: 2025-11-22 07:58:19.804 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.201 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk 1073741824" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.202 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.203 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.259 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.260 186792 DEBUG nova.virt.disk.api [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Checking if we can resize image /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.260 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.314 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.315 186792 DEBUG nova.virt.disk.api [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Cannot resize image /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.316 186792 DEBUG nova.objects.instance [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'migration_context' on Instance uuid c1f4b900-94ac-4865-bbb3-cb003a35e9ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.328 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.328 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Ensure instance console log exists: /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.329 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.329 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.329 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:20 np0005531888 nova_compute[186788]: 2025-11-22 07:58:20.441 186792 DEBUG nova.policy [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:58:20 np0005531888 podman[225200]: 2025-11-22 07:58:20.683396438 +0000 UTC m=+0.056032780 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:58:20 np0005531888 podman[225225]: 2025-11-22 07:58:20.760623818 +0000 UTC m=+0.046619408 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:58:22 np0005531888 nova_compute[186788]: 2025-11-22 07:58:22.081 186792 DEBUG nova.network.neutron [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Successfully created port: 0ee87554-bfe6-414b-a745-e97be6123cf6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:58:22 np0005531888 nova_compute[186788]: 2025-11-22 07:58:22.865 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.216 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.374 186792 DEBUG nova.network.neutron [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Successfully updated port: 0ee87554-bfe6-414b-a745-e97be6123cf6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.430 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.431 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquired lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.431 186792 DEBUG nova.network.neutron [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.514 186792 DEBUG nova.compute.manager [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-changed-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.514 186792 DEBUG nova.compute.manager [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Refreshing instance network info cache due to event network-changed-0ee87554-bfe6-414b-a745-e97be6123cf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.514 186792 DEBUG oslo_concurrency.lockutils [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:24 np0005531888 nova_compute[186788]: 2025-11-22 07:58:24.641 186792 DEBUG nova.network.neutron [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.678 186792 DEBUG nova.network.neutron [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updating instance_info_cache with network_info: [{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.745 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Releasing lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.745 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Instance network_info: |[{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.746 186792 DEBUG oslo_concurrency.lockutils [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.746 186792 DEBUG nova.network.neutron [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Refreshing network info cache for port 0ee87554-bfe6-414b-a745-e97be6123cf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.749 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Start _get_guest_xml network_info=[{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.753 186792 WARNING nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.759 186792 DEBUG nova.virt.libvirt.host [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.760 186792 DEBUG nova.virt.libvirt.host [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.768 186792 DEBUG nova.virt.libvirt.host [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.769 186792 DEBUG nova.virt.libvirt.host [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.770 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.770 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.770 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.771 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.771 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.771 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.771 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.771 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.772 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.772 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.772 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.772 186792 DEBUG nova.virt.hardware [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.775 186792 DEBUG nova.virt.libvirt.vif [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-322704032',display_name='tempest-SecurityGroupsTestJSON-server-322704032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-322704032',id=81,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-b530zn28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTest
JSON-2135176549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:19Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=c1f4b900-94ac-4865-bbb3-cb003a35e9ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.776 186792 DEBUG nova.network.os_vif_util [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.776 186792 DEBUG nova.network.os_vif_util [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.777 186792 DEBUG nova.objects.instance [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'pci_devices' on Instance uuid c1f4b900-94ac-4865-bbb3-cb003a35e9ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.794 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <uuid>c1f4b900-94ac-4865-bbb3-cb003a35e9ee</uuid>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <name>instance-00000051</name>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:name>tempest-SecurityGroupsTestJSON-server-322704032</nova:name>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:58:26</nova:creationTime>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:user uuid="d77b927940494160bce27934c565fda7">tempest-SecurityGroupsTestJSON-2135176549-project-member</nova:user>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:project uuid="d2bcbcf3720f46be9fea7fc4685dfecd">tempest-SecurityGroupsTestJSON-2135176549</nova:project>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        <nova:port uuid="0ee87554-bfe6-414b-a745-e97be6123cf6">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <entry name="serial">c1f4b900-94ac-4865-bbb3-cb003a35e9ee</entry>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <entry name="uuid">c1f4b900-94ac-4865-bbb3-cb003a35e9ee</entry>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.config"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:cc:f0:34"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <target dev="tap0ee87554-bf"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/console.log" append="off"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:58:26 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:58:26 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:58:26 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:58:26 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.795 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Preparing to wait for external event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.795 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.796 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.796 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.796 186792 DEBUG nova.virt.libvirt.vif [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-322704032',display_name='tempest-SecurityGroupsTestJSON-server-322704032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-322704032',id=81,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-b530zn28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-Security
GroupsTestJSON-2135176549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:19Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=c1f4b900-94ac-4865-bbb3-cb003a35e9ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.797 186792 DEBUG nova.network.os_vif_util [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.797 186792 DEBUG nova.network.os_vif_util [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.798 186792 DEBUG os_vif [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.798 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.798 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.799 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.801 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ee87554-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.801 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ee87554-bf, col_values=(('external_ids', {'iface-id': '0ee87554-bfe6-414b-a745-e97be6123cf6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:f0:34', 'vm-uuid': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:26 np0005531888 NetworkManager[55166]: <info>  [1763798306.8039] manager: (tap0ee87554-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.808 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.809 186792 INFO os_vif [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf')#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.868 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.868 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.868 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] No VIF found with MAC fa:16:3e:cc:f0:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:58:26 np0005531888 nova_compute[186788]: 2025-11-22 07:58:26.868 186792 INFO nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Using config drive#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.245 186792 INFO nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Creating config drive at /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.config#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.250 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzurppk4o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.375 186792 DEBUG oslo_concurrency.processutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzurppk4o" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:27 np0005531888 kernel: tap0ee87554-bf: entered promiscuous mode
Nov 22 02:58:27 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:27Z|00211|binding|INFO|Claiming lport 0ee87554-bfe6-414b-a745-e97be6123cf6 for this chassis.
Nov 22 02:58:27 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:27Z|00212|binding|INFO|0ee87554-bfe6-414b-a745-e97be6123cf6: Claiming fa:16:3e:cc:f0:34 10.100.0.6
Nov 22 02:58:27 np0005531888 NetworkManager[55166]: <info>  [1763798307.4486] manager: (tap0ee87554-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.447 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 systemd-udevd[225262]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.487 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:f0:34 10.100.0.6'], port_security=['fa:16:3e:cc:f0:34 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '726ed215-2cc1-4cd0-860c-0d95ad883b6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63d51e5f-a087-4eb1-a0c4-4a9ee7856c37, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=0ee87554-bfe6-414b-a745-e97be6123cf6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.488 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 0ee87554-bfe6-414b-a745-e97be6123cf6 in datapath 9f740f05-d312-4e00-a27d-4d2a45e526b6 bound to our chassis#033[00m
Nov 22 02:58:27 np0005531888 systemd-machined[153106]: New machine qemu-37-instance-00000051.
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.490 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f740f05-d312-4e00-a27d-4d2a45e526b6#033[00m
Nov 22 02:58:27 np0005531888 NetworkManager[55166]: <info>  [1763798307.4984] device (tap0ee87554-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:58:27 np0005531888 NetworkManager[55166]: <info>  [1763798307.5002] device (tap0ee87554-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.502 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6275ba5e-c4f7-4b02-b945-4cfa6892d2f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.503 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f740f05-d1 in ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.506 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f740f05-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.506 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[794f0b05-f4f9-44b1-a68f-d8b0486b182e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.507 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5443814e-1fb9-42bd-8390-306f2f4cd1f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.509 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 systemd[1]: Started Virtual Machine qemu-37-instance-00000051.
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.516 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:27Z|00213|binding|INFO|Setting lport 0ee87554-bfe6-414b-a745-e97be6123cf6 ovn-installed in OVS
Nov 22 02:58:27 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:27Z|00214|binding|INFO|Setting lport 0ee87554-bfe6-414b-a745-e97be6123cf6 up in Southbound
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.521 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.525 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[236cd0d8-fde0-4dfe-a9b5-123fdb371a96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.550 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ad81a416-d9ef-4feb-bcff-0fa85de8cae0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.586 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6dce7edd-c864-4cf0-9d03-694d3bfa4aee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.593 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd66d30-ee8b-4d61-a6eb-0ec191a87839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 NetworkManager[55166]: <info>  [1763798307.5944] manager: (tap9f740f05-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/116)
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.629 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[64510bc9-bf7a-45f5-8a90-f3e92a131346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.633 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ded9e501-ea23-4e4d-a3ca-38686554ef01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 NetworkManager[55166]: <info>  [1763798307.6616] device (tap9f740f05-d0): carrier: link connected
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.670 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8c601774-8820-4d37-b513-2f4f82313cb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.688 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6e1707-8d95-4750-8e86-81a05bf7ebbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f740f05-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:0d:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504830, 'reachable_time': 25244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225296, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.707 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ff093116-11a4-4bbf-a03f-fb3fc16bac8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe56:d1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504830, 'tstamp': 504830}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225297, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.730 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[36dbbf87-f8e4-4a64-9b2f-e990bd9a4c5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f740f05-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:0d:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504830, 'reachable_time': 25244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225299, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.760 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[837c39ef-6d6e-48f4-802b-39aad1dbdfdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.832 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798307.8314207, c1f4b900-94ac-4865-bbb3-cb003a35e9ee => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.832 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] VM Started (Lifecycle Event)#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.834 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e66ffc26-374d-404e-a0dd-34ddcbc6e914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.836 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f740f05-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.836 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.837 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f740f05-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.839 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 NetworkManager[55166]: <info>  [1763798307.8402] manager: (tap9f740f05-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Nov 22 02:58:27 np0005531888 kernel: tap9f740f05-d0: entered promiscuous mode
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.843 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.845 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f740f05-d0, col_values=(('external_ids', {'iface-id': 'a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.847 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:27Z|00215|binding|INFO|Releasing lport a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31 from this chassis (sb_readonly=0)
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.848 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.850 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.851 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b39ab4e2-88e5-4624-93e5-e421318ada14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.853 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:58:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:27.854 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'env', 'PROCESS_TAG=haproxy-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f740f05-d312-4e00-a27d-4d2a45e526b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.862 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.863 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.869 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798307.8316174, c1f4b900-94ac-4865-bbb3-cb003a35e9ee => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.869 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.884 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.890 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.893 186792 DEBUG nova.compute.manager [req-2ef39ffc-da60-4349-a59b-4735d9a34fbb req-7ed6037a-f6c7-4ebb-a2bc-77cf2d237826 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.893 186792 DEBUG oslo_concurrency.lockutils [req-2ef39ffc-da60-4349-a59b-4735d9a34fbb req-7ed6037a-f6c7-4ebb-a2bc-77cf2d237826 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.894 186792 DEBUG oslo_concurrency.lockutils [req-2ef39ffc-da60-4349-a59b-4735d9a34fbb req-7ed6037a-f6c7-4ebb-a2bc-77cf2d237826 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.894 186792 DEBUG oslo_concurrency.lockutils [req-2ef39ffc-da60-4349-a59b-4735d9a34fbb req-7ed6037a-f6c7-4ebb-a2bc-77cf2d237826 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.894 186792 DEBUG nova.compute.manager [req-2ef39ffc-da60-4349-a59b-4735d9a34fbb req-7ed6037a-f6c7-4ebb-a2bc-77cf2d237826 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Processing event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.895 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.899 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.905 186792 INFO nova.virt.libvirt.driver [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Instance spawned successfully.#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.906 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.912 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.913 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798307.8991027, c1f4b900-94ac-4865-bbb3-cb003a35e9ee => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.913 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.934 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.943 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.944 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.945 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.945 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.946 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.946 186792 DEBUG nova.virt.libvirt.driver [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.953 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:27 np0005531888 nova_compute[186788]: 2025-11-22 07:58:27.985 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.037 186792 INFO nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Took 8.38 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.037 186792 DEBUG nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.115 186792 INFO nova.compute.manager [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Took 9.06 seconds to build instance.#033[00m
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.142 186792 DEBUG oslo_concurrency.lockutils [None req-d6252c99-0f60-4bef-acc9-6826fcbb0b76 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:28 np0005531888 podman[225336]: 2025-11-22 07:58:28.288022682 +0000 UTC m=+0.077680672 container create 81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:58:28 np0005531888 podman[225336]: 2025-11-22 07:58:28.236234358 +0000 UTC m=+0.025892378 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:58:28 np0005531888 systemd[1]: Started libpod-conmon-81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828.scope.
Nov 22 02:58:28 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:58:28 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e351f87d2ee69ebac02e5e0e4c7d750aedeaf42356480c744c6168e33afbc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:58:28 np0005531888 podman[225336]: 2025-11-22 07:58:28.381104332 +0000 UTC m=+0.170762342 container init 81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:58:28 np0005531888 podman[225336]: 2025-11-22 07:58:28.38794561 +0000 UTC m=+0.177603590 container start 81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:58:28 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [NOTICE]   (225356) : New worker (225358) forked
Nov 22 02:58:28 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [NOTICE]   (225356) : Loading success.
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.488 186792 DEBUG nova.network.neutron [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updated VIF entry in instance network info cache for port 0ee87554-bfe6-414b-a745-e97be6123cf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.489 186792 DEBUG nova.network.neutron [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updating instance_info_cache with network_info: [{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:28 np0005531888 nova_compute[186788]: 2025-11-22 07:58:28.511 186792 DEBUG oslo_concurrency.lockutils [req-69bee5dc-22cd-4d88-9833-bcb4b5ba4769 req-afd649ca-dd32-43ca-9ad2-4d6d173f0fcb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:29 np0005531888 podman[225367]: 2025-11-22 07:58:29.69028832 +0000 UTC m=+0.060088160 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:58:29 np0005531888 nova_compute[186788]: 2025-11-22 07:58:29.990 186792 DEBUG nova.compute.manager [req-5c71b4f0-d785-4167-b4be-ecb257714c34 req-58f99de8-0129-40be-ab97-8edaa9dca551 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:29 np0005531888 nova_compute[186788]: 2025-11-22 07:58:29.991 186792 DEBUG oslo_concurrency.lockutils [req-5c71b4f0-d785-4167-b4be-ecb257714c34 req-58f99de8-0129-40be-ab97-8edaa9dca551 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:29 np0005531888 nova_compute[186788]: 2025-11-22 07:58:29.991 186792 DEBUG oslo_concurrency.lockutils [req-5c71b4f0-d785-4167-b4be-ecb257714c34 req-58f99de8-0129-40be-ab97-8edaa9dca551 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:29 np0005531888 nova_compute[186788]: 2025-11-22 07:58:29.991 186792 DEBUG oslo_concurrency.lockutils [req-5c71b4f0-d785-4167-b4be-ecb257714c34 req-58f99de8-0129-40be-ab97-8edaa9dca551 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:29 np0005531888 nova_compute[186788]: 2025-11-22 07:58:29.992 186792 DEBUG nova.compute.manager [req-5c71b4f0-d785-4167-b4be-ecb257714c34 req-58f99de8-0129-40be-ab97-8edaa9dca551 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] No waiting events found dispatching network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:29 np0005531888 nova_compute[186788]: 2025-11-22 07:58:29.992 186792 WARNING nova.compute.manager [req-5c71b4f0-d785-4167-b4be-ecb257714c34 req-58f99de8-0129-40be-ab97-8edaa9dca551 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received unexpected event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:58:30 np0005531888 nova_compute[186788]: 2025-11-22 07:58:30.330 186792 DEBUG nova.compute.manager [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-changed-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:30 np0005531888 nova_compute[186788]: 2025-11-22 07:58:30.331 186792 DEBUG nova.compute.manager [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Refreshing instance network info cache due to event network-changed-0ee87554-bfe6-414b-a745-e97be6123cf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:58:30 np0005531888 nova_compute[186788]: 2025-11-22 07:58:30.331 186792 DEBUG oslo_concurrency.lockutils [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:30 np0005531888 nova_compute[186788]: 2025-11-22 07:58:30.332 186792 DEBUG oslo_concurrency.lockutils [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:30 np0005531888 nova_compute[186788]: 2025-11-22 07:58:30.332 186792 DEBUG nova.network.neutron [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Refreshing network info cache for port 0ee87554-bfe6-414b-a745-e97be6123cf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:58:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:30.723 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:30.725 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:58:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:30.726 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:30 np0005531888 nova_compute[186788]: 2025-11-22 07:58:30.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:31 np0005531888 podman[225387]: 2025-11-22 07:58:31.688468608 +0000 UTC m=+0.058026918 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:58:31 np0005531888 nova_compute[186788]: 2025-11-22 07:58:31.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:32 np0005531888 nova_compute[186788]: 2025-11-22 07:58:32.777 186792 DEBUG nova.compute.manager [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-changed-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:32 np0005531888 nova_compute[186788]: 2025-11-22 07:58:32.777 186792 DEBUG nova.compute.manager [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Refreshing instance network info cache due to event network-changed-0ee87554-bfe6-414b-a745-e97be6123cf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:58:32 np0005531888 nova_compute[186788]: 2025-11-22 07:58:32.778 186792 DEBUG oslo_concurrency.lockutils [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:32 np0005531888 nova_compute[186788]: 2025-11-22 07:58:32.871 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:33 np0005531888 nova_compute[186788]: 2025-11-22 07:58:33.046 186792 DEBUG nova.network.neutron [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updated VIF entry in instance network info cache for port 0ee87554-bfe6-414b-a745-e97be6123cf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:58:33 np0005531888 nova_compute[186788]: 2025-11-22 07:58:33.047 186792 DEBUG nova.network.neutron [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updating instance_info_cache with network_info: [{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:33 np0005531888 nova_compute[186788]: 2025-11-22 07:58:33.084 186792 DEBUG oslo_concurrency.lockutils [req-44e64db1-400a-46ff-b134-07759908c5a2 req-50eb395d-dacb-40ee-92c7-35bd21f38040 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:33 np0005531888 nova_compute[186788]: 2025-11-22 07:58:33.085 186792 DEBUG oslo_concurrency.lockutils [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:33 np0005531888 nova_compute[186788]: 2025-11-22 07:58:33.085 186792 DEBUG nova.network.neutron [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Refreshing network info cache for port 0ee87554-bfe6-414b-a745-e97be6123cf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:58:34 np0005531888 nova_compute[186788]: 2025-11-22 07:58:34.547 186792 DEBUG nova.network.neutron [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updated VIF entry in instance network info cache for port 0ee87554-bfe6-414b-a745-e97be6123cf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:58:34 np0005531888 nova_compute[186788]: 2025-11-22 07:58:34.547 186792 DEBUG nova.network.neutron [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updating instance_info_cache with network_info: [{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:34 np0005531888 nova_compute[186788]: 2025-11-22 07:58:34.562 186792 DEBUG oslo_concurrency.lockutils [req-22c3e5cb-6303-4385-9f26-c2e3c0558dda req-90b6fd5e-9d07-433b-a96d-d69fbf46dcf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:36 np0005531888 podman[225411]: 2025-11-22 07:58:36.680813267 +0000 UTC m=+0.050846302 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 22 02:58:36 np0005531888 nova_compute[186788]: 2025-11-22 07:58:36.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:36.811 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:36.812 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:36.812 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.842 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000051', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'user_id': 'd77b927940494160bce27934c565fda7', 'hostId': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.855 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.855 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '879df591-c511-4f5b-89d7-27aa3bea21c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.843655', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cd23054-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.543144597, 'message_signature': 'dc982345bdeeffb288891546ed7645e0681962289f1682639c2baadeb44af82c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.843655', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cd23e14-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.543144597, 'message_signature': '6f0b0780049f966bd4f9e70e1e257da1f7cc5244491277262836c5a1ffaf67ab'}]}, 'timestamp': '2025-11-22 07:58:36.856301', '_unique_id': '446c8457176647e698702053b511ec83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.857 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.858 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.864 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c1f4b900-94ac-4865-bbb3-cb003a35e9ee / tap0ee87554-bf inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.865 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70bce0cd-bd86-445d-b87f-5076ca62dc97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.858471', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cd3a970-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '9536037b7b9980ba51dcb3791e0d33bfe9a24fcc70a136b27b9c9643d5f26f1f'}]}, 'timestamp': '2025-11-22 07:58:36.865543', '_unique_id': 'd5eb034c99bf4655947e1c5e7133736d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.894 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.895 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67122e5f-780a-45a9-b106-7c14f34125e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.867378', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cd83aee-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '43c3f7a4754fbed114108791bcd5cba3b2cb6973f06957d7a0e44cc5937ae9fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.867378', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cd84912-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': 'd9d3b26823de7d10b085480754ade01b85cf37f5f11d96fa897c697d62c5597b'}]}, 'timestamp': '2025-11-22 07:58:36.895786', '_unique_id': '15232deff6bc4c88a3c937ba7b98744a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.896 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.897 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.898 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>]
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.898 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.898 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.898 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>]
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.898 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a52d6d21-bf63-4385-a57f-565a3fb2dc4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.898918', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cd8ce82-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '93fda84111e0f808b828890fa51250705016ba6689b532b484042cf1170c0655'}]}, 'timestamp': '2025-11-22 07:58:36.899203', '_unique_id': '0c2f0b4735b74ce893bb4676f685bec1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.899 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.900 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.919 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/cpu volume: 8740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a822f98a-7f23-4573-b85f-1732919c923d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8740000000, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'timestamp': '2025-11-22T07:58:36.900759', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0cdbfff8-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.618954872, 'message_signature': '70bcdf3602649aca05e693cef9c550e1ac7bc3b2c61b4fb23314c5d679d96723'}]}, 'timestamp': '2025-11-22 07:58:36.920541', '_unique_id': 'da05e84b953d40d7b1f741c6ef8b65ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.922 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a00912bc-03b8-428f-9b50-06866fe0ec59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.922504', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cdc6920-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '91ef752ecade65868bf5a4a32a6595a9a24556447be2a4b6555eaed947938719'}]}, 'timestamp': '2025-11-22 07:58:36.922820', '_unique_id': '5ea079f12bac4e1e93f4c511736ce518'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.924 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ca6651d-faba-408f-93f1-4385c296f184', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.924265', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cdcac14-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '51e0d8e0b9dac98cdfff72ab114b330ec2202892721967ef32742811deb1f36d'}]}, 'timestamp': '2025-11-22 07:58:36.924529', '_unique_id': '1f064193ab84488f94154721f26e9070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab7b40c0-ed2c-4e1f-a4ae-7fefb3798154', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.926044', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cdcf192-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '908afb962fa9208c88cdae63dbefa6ad0dc5725f632beb12dc93a3cff5547f10'}]}, 'timestamp': '2025-11-22 07:58:36.926315', '_unique_id': 'd0957e0362934fe9901d221e1c87174d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.927 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cde577ef-2ee0-4d68-b9a3-b41dfaf03dc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.927808', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cdd36a2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': 'cecac045992e5f0a1348ba81c121d774ea426acafe3e72e6678edc23a8153ad4'}]}, 'timestamp': '2025-11-22 07:58:36.928075', '_unique_id': '9a9328906b854f3bbce378bc286b595d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.929 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.929 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67508823-8d0e-45dc-a7d2-0edb651bc793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.929508', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cdd7a18-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '962af7e12d0b1e4ecb3326826de05bba8b5db6f56b1ca5786a7e7bab4df7fc27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': 
None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.929508', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cdd8382-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '92ecaca951abfbe4540ad5465429b5f29af1339b7c98edf22e95fb26339f466e'}]}, 'timestamp': '2025-11-22 07:58:36.930030', '_unique_id': '6bce7e103cab459caf691e9d2751a409'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.931 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>]
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.932 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.932 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c1f4b900-94ac-4865-bbb3-cb003a35e9ee: ceilometer.compute.pollsters.NoVolumeException
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.932 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.932 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd448e20b-56bb-4e9b-88c6-657a01571ead', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.932376', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cdde8c2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '9a1510d2edaf4b1a3f075862ad623ef143d843656e3021ef8bc54a8551333ca7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': 
None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.932376', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cddf358-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': 'b8a12582c9b9efe4c63b4178e9e032bec06d112d887a775a73a059724de1ec5c'}]}, 'timestamp': '2025-11-22 07:58:36.932894', '_unique_id': 'd2772aa8094e433ca603d80c10c7941a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.934 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.934 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-322704032>]
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.934 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3be49a74-a8e0-4b39-8967-85d40565d7b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.934763', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cde460a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': 'a0f75b736563b32b269f4ad8e0993b27d40a6dadaaec102c8e35b0e86e7ed46a'}]}, 'timestamp': '2025-11-22 07:58:36.935063', '_unique_id': 'f24acda0803444b98ebe672f4d8e93c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.936 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '830299e7-0032-4e7d-83ae-4b8378ca8a8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.936782', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cde9506-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '005cc8b03b0d0ace9737d3e37f582ea8da3cba936241321096b4f3babd9667ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.936782', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cde9e48-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': 'c62ed41883c7f431f933bdb105a436ad78c3b8847124b2657e92b45b37a2feeb'}]}, 'timestamp': '2025-11-22 07:58:36.937270', '_unique_id': '29b07b977d344032b044e8fcc349b52a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.938 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47097f0b-d0c0-4409-b56a-58b714dccee3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.938862', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cdee664-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': 'e3ec7f32af0cc00057aee84a571922bfd15899af4193f0edbf62795d3a75c358'}]}, 'timestamp': '2025-11-22 07:58:36.939137', '_unique_id': '32324e86b2604fe3a0d3484b6d3930eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.940 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d28d957-abf2-49bb-83c1-bbad2ad4d808', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.940753', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cdf3042-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': 'd3294828ea1f3c95c09cae4a88b4439848d60173e6ee9c2ef9ace668814dae7f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.940753', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cdf3bbe-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '440ae9f0ee46cc6bc7e20c2def1ae6237a0a3e097aef55061b82b5e9eac25194'}]}, 'timestamp': '2025-11-22 07:58:36.941308', '_unique_id': '48d3ab5fcd574877affb67a6d0cde5b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.942 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e9164a0-2fe4-4e96-9794-baaa42acb5dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.942830', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0cdf816e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '431e5672c2b34d8e0458b7c60c746fa53cc34989edb778db492490441c2c3973'}]}, 'timestamp': '2025-11-22 07:58:36.943100', '_unique_id': '1137e43dd11445c6a5f59cb0e33ea85e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.944 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.read.latency volume: 718868787 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.944 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.read.latency volume: 2644505 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57d73803-aca4-4182-9286-41be41148517', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 718868787, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.944682', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cdfcb06-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': 'e7649480925c579c7b21ddaaaf7cb9a3ac6c73ed48dfd60e46e9fb1825b384ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2644505, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': 
None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.944682', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cdfd47a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.566889571, 'message_signature': '5e613476516c17dd653ea3533eb8cee0a8239620f89678c81d57aec2a247529f'}]}, 'timestamp': '2025-11-22 07:58:36.945215', '_unique_id': 'b698221b030d4d329ec1a11fb2a0ec99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.946 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.947 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.947 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2762a2d-3435-42bf-bee8-650473789e31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.947064', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ce027e0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.543144597, 'message_signature': 'a9fde0d87c5a4269a278239a05caddaaa13cd6c94e9c45269c7b6eb21e9a9022'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 
'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.947064', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ce03168-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.543144597, 'message_signature': '4c2f6f81ed7e8ef92cf5a57ac436a3fd60514455cedaaf10178d4f6433768606'}]}, 'timestamp': '2025-11-22 07:58:36.947627', '_unique_id': 'ca3180084f0f405bba3806d10292b97c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.949 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.949 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.949 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21560cfa-2d8a-402d-9ad2-2acac04c80c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-vda', 'timestamp': '2025-11-22T07:58:36.949202', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ce07a24-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.543144597, 'message_signature': 'b8efbf20d49fa91bbdfe2b1627b4c4ed7fe331abb89d27629730c99ce02c7005'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee-sda', 'timestamp': '2025-11-22T07:58:36.949202', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'instance-00000051', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ce084ce-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.543144597, 'message_signature': '1b134135ff1abcafc3a7137df1a1e78e857e7b1ca0d6db6f706daed5a4b3b660'}]}, 'timestamp': '2025-11-22 07:58:36.949724', '_unique_id': '6db3193ca14e4799bdaea8b242b0e65c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.950 12 DEBUG ceilometer.compute.pollsters [-] c1f4b900-94ac-4865-bbb3-cb003a35e9ee/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b3e8a29-fed5-4c18-99ad-f33944ca16ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_name': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_name': None, 'resource_id': 'instance-00000051-c1f4b900-94ac-4865-bbb3-cb003a35e9ee-tap0ee87554-bf', 'timestamp': '2025-11-22T07:58:36.950945', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-322704032', 'name': 'tap0ee87554-bf', 'instance_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'instance_type': 'm1.nano', 'host': '5c3ef1d833a43f6a2a417764cbed45bf57c01a370aea71c8d1b5c06b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cc:f0:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ee87554-bf'}, 'message_id': '0ce0bcf0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5057.557977893, 'message_signature': '02d50ab23ce3233f4661f71f57cc83a2e4b6698bc09a24c2a381fcc585f93438'}]}, 'timestamp': '2025-11-22 07:58:36.951170', '_unique_id': 'a84b8954763549739ed4d4ef7ff2d8b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 07:58:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531888 nova_compute[186788]: 2025-11-22 07:58:37.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.422 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.422 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.436 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.535 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.536 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.544 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.545 186792 INFO nova.compute.claims [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Claim successful on node compute-2.ctlplane.example.com
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.671 186792 DEBUG nova.compute.provider_tree [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.683 186792 DEBUG nova.scheduler.client.report [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.710 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.710 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.760 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.761 186792 DEBUG nova.network.neutron [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.778 186792 INFO nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.794 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.806 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.921 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.923 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.923 186792 INFO nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Creating image(s)#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.924 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.924 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.925 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:41 np0005531888 nova_compute[186788]: 2025-11-22 07:58:41.942 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.001 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.002 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.003 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.017 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.036 186792 DEBUG nova.policy [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.074 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.075 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.245 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk 1073741824" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.246 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.246 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.308 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.309 186792 DEBUG nova.virt.disk.api [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Checking if we can resize image /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.310 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.371 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.373 186792 DEBUG nova.virt.disk.api [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Cannot resize image /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.373 186792 DEBUG nova.objects.instance [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.392 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.393 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Ensure instance console log exists: /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.394 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.394 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.394 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:42 np0005531888 podman[225460]: 2025-11-22 07:58:42.686165719 +0000 UTC m=+0.056100151 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 02:58:42 np0005531888 podman[225461]: 2025-11-22 07:58:42.717474139 +0000 UTC m=+0.083693891 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.874 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:42 np0005531888 nova_compute[186788]: 2025-11-22 07:58:42.901 186792 DEBUG nova.network.neutron [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Successfully created port: e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.836 186792 DEBUG nova.network.neutron [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Successfully updated port: e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.849 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.850 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.850 186792 DEBUG nova.network.neutron [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.932 186792 DEBUG nova.compute.manager [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.932 186792 DEBUG nova.compute.manager [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing instance network info cache due to event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:58:43 np0005531888 nova_compute[186788]: 2025-11-22 07:58:43.933 186792 DEBUG oslo_concurrency.lockutils [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:44 np0005531888 nova_compute[186788]: 2025-11-22 07:58:44.029 186792 DEBUG nova.network.neutron [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:58:45 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:45Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:f0:34 10.100.0.6
Nov 22 02:58:45 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:45Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:f0:34 10.100.0.6
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.229 186792 DEBUG nova.network.neutron [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.265 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.266 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance network_info: |[{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.266 186792 DEBUG oslo_concurrency.lockutils [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.266 186792 DEBUG nova.network.neutron [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.269 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start _get_guest_xml network_info=[{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.275 186792 WARNING nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.279 186792 DEBUG nova.virt.libvirt.host [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.280 186792 DEBUG nova.virt.libvirt.host [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.287 186792 DEBUG nova.virt.libvirt.host [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.287 186792 DEBUG nova.virt.libvirt.host [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.288 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.289 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.289 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.289 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.290 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.290 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.290 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.290 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.291 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.291 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.291 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.291 186792 DEBUG nova.virt.hardware [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.295 186792 DEBUG nova.virt.libvirt.vif [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.296 186792 DEBUG nova.network.os_vif_util [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.297 186792 DEBUG nova.network.os_vif_util [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.297 186792 DEBUG nova.objects.instance [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.307 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <uuid>92409a46-2dd7-4b20-ac9d-958bbb30993d</uuid>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <name>instance-00000053</name>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestOtherB-server-342710330</nova:name>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:58:46</nova:creationTime>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        <nova:port uuid="e963f21d-d8c0-4f76-b5bc-4a3f577d4055">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <entry name="serial">92409a46-2dd7-4b20-ac9d-958bbb30993d</entry>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <entry name="uuid">92409a46-2dd7-4b20-ac9d-958bbb30993d</entry>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:b9:5f:a6"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <target dev="tape963f21d-d8"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/console.log" append="off"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:58:46 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:58:46 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:58:46 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:58:46 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.308 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Preparing to wait for external event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.308 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.309 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.309 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.309 186792 DEBUG nova.virt.libvirt.vif [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.310 186792 DEBUG nova.network.os_vif_util [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.310 186792 DEBUG nova.network.os_vif_util [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.311 186792 DEBUG os_vif [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.311 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.312 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.318 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.318 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape963f21d-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.319 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape963f21d-d8, col_values=(('external_ids', {'iface-id': 'e963f21d-d8c0-4f76-b5bc-4a3f577d4055', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:5f:a6', 'vm-uuid': '92409a46-2dd7-4b20-ac9d-958bbb30993d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:46 np0005531888 NetworkManager[55166]: <info>  [1763798326.3210] manager: (tape963f21d-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.322 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.327 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.328 186792 INFO os_vif [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.398 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.398 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.399 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No VIF found with MAC fa:16:3e:b9:5f:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.399 186792 INFO nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Using config drive#033[00m
Nov 22 02:58:46 np0005531888 nova_compute[186788]: 2025-11-22 07:58:46.993 186792 INFO nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Creating config drive at /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.006 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg8ftrvxz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.135 186792 DEBUG oslo_concurrency.processutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg8ftrvxz" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:47 np0005531888 kernel: tape963f21d-d8: entered promiscuous mode
Nov 22 02:58:47 np0005531888 NetworkManager[55166]: <info>  [1763798327.2207] manager: (tape963f21d-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Nov 22 02:58:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:47Z|00216|binding|INFO|Claiming lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for this chassis.
Nov 22 02:58:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:47Z|00217|binding|INFO|e963f21d-d8c0-4f76-b5bc-4a3f577d4055: Claiming fa:16:3e:b9:5f:a6 10.100.0.4
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.220 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.233 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:5f:a6 10.100.0.4'], port_security=['fa:16:3e:b9:5f:a6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '92409a46-2dd7-4b20-ac9d-958bbb30993d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e963f21d-d8c0-4f76-b5bc-4a3f577d4055) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.235 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 bound to our chassis#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.238 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 02:58:47 np0005531888 systemd-udevd[225534]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.253 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5109e26a-8bcd-4837-97b1-610e8972e57d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.254 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7727db5-41 in ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.256 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7727db5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.256 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[50cf79f2-5db3-4810-bb7b-11edac58cb82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.256 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[08000c51-8abc-4f47-913a-3210a2689adc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 systemd-machined[153106]: New machine qemu-38-instance-00000053.
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.268 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[8b36baa3-1375-4411-94a6-44f1cd8456c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 NetworkManager[55166]: <info>  [1763798327.2702] device (tape963f21d-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:58:47 np0005531888 NetworkManager[55166]: <info>  [1763798327.2708] device (tape963f21d-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 systemd[1]: Started Virtual Machine qemu-38-instance-00000053.
Nov 22 02:58:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:47Z|00218|binding|INFO|Setting lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 ovn-installed in OVS
Nov 22 02:58:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:47Z|00219|binding|INFO|Setting lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 up in Southbound
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.283 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.293 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3adacd9-248e-4e27-80ea-b7bbb8660c3c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.325 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[03756dfa-5a29-48e5-968f-36a06ba0aee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 NetworkManager[55166]: <info>  [1763798327.3322] manager: (tapf7727db5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.331 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[762685a8-ed93-4aec-81d6-1f952f8a4554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.365 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ce2984-6e93-4260-81a2-b79ebfc2331b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.369 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ca7737-6563-407c-bb5f-9f49e80a6c21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 NetworkManager[55166]: <info>  [1763798327.3936] device (tapf7727db5-40): carrier: link connected
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.400 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[84862388-b4b9-4588-9a78-08ba638c388a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.415 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d769c08f-fd02-4a4c-b386-f8bd6f9d7b53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225567, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.429 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[707825b7-12a7-4a8b-8a23-0be53e8089b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:3e1a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506803, 'tstamp': 506803}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225568, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.442 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7e2b6551-8884-4971-8ae2-3b74dc142e27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225569, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.469 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa774a1-de59-43e9-8529-91e23340c59c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.522 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7cac65ea-d787-49d5-aafc-3f9c7016c64f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.523 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.523 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.523 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.525 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 NetworkManager[55166]: <info>  [1763798327.5260] manager: (tapf7727db5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Nov 22 02:58:47 np0005531888 kernel: tapf7727db5-40: entered promiscuous mode
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.528 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.530 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.531 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:47Z|00220|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.534 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.535 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e8ec1f-3f9f-452a-ade1-f71688a8ef89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.536 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:58:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:58:47.536 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'env', 'PROCESS_TAG=haproxy-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7727db5-43a6-48f6-abbf-aa184d8ad087.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.543 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.679 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:47 np0005531888 nova_compute[186788]: 2025-11-22 07:58:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:58:47 np0005531888 podman[225600]: 2025-11-22 07:58:47.868568464 +0000 UTC m=+0.020566717 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:58:48 np0005531888 podman[225600]: 2025-11-22 07:58:48.228503499 +0000 UTC m=+0.380501732 container create f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 02:58:48 np0005531888 systemd[1]: Started libpod-conmon-f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac.scope.
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.328 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798328.3283281, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.329 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Started (Lifecycle Event)#033[00m
Nov 22 02:58:48 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:58:48 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26a18f2fa3531b624cc4901bea16207a881a8862117de9a8a0d94429734c42a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.361 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.364 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798328.3312118, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.364 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.388 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.391 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:48 np0005531888 podman[225600]: 2025-11-22 07:58:48.405190026 +0000 UTC m=+0.557188289 container init f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:58:48 np0005531888 podman[225600]: 2025-11-22 07:58:48.412396524 +0000 UTC m=+0.564394757 container start f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.416 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:58:48 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [NOTICE]   (225626) : New worker (225628) forked
Nov 22 02:58:48 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [NOTICE]   (225626) : Loading success.
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.444 186792 DEBUG nova.network.neutron [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updated VIF entry in instance network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.446 186792 DEBUG nova.network.neutron [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.468 186792 DEBUG oslo_concurrency.lockutils [req-95039378-38af-412b-a187-29ed7a359109 req-e159dc5f-2059-4f9e-b783-1a449141b63c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:48 np0005531888 nova_compute[186788]: 2025-11-22 07:58:48.967 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.538 186792 DEBUG nova.compute.manager [req-94a4cb45-e234-4961-83a8-382fc5a210d0 req-8c3fcc20-051e-48a2-a6d6-809b2fa54277 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.539 186792 DEBUG oslo_concurrency.lockutils [req-94a4cb45-e234-4961-83a8-382fc5a210d0 req-8c3fcc20-051e-48a2-a6d6-809b2fa54277 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.539 186792 DEBUG oslo_concurrency.lockutils [req-94a4cb45-e234-4961-83a8-382fc5a210d0 req-8c3fcc20-051e-48a2-a6d6-809b2fa54277 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.539 186792 DEBUG oslo_concurrency.lockutils [req-94a4cb45-e234-4961-83a8-382fc5a210d0 req-8c3fcc20-051e-48a2-a6d6-809b2fa54277 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.539 186792 DEBUG nova.compute.manager [req-94a4cb45-e234-4961-83a8-382fc5a210d0 req-8c3fcc20-051e-48a2-a6d6-809b2fa54277 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Processing event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.540 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.544 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798329.5442915, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.545 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.548 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.554 186792 INFO nova.virt.libvirt.driver [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance spawned successfully.#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.555 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.576 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.583 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.588 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.588 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.589 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.589 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.590 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.590 186792 DEBUG nova.virt.libvirt.driver [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.613 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.648 186792 INFO nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 7.73 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.648 186792 DEBUG nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.716 186792 INFO nova.compute.manager [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 8.23 seconds to build instance.#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.729 186792 DEBUG oslo_concurrency.lockutils [None req-48c4e21b-efaa-438f-ba51-aca35338c425 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:58:49 np0005531888 nova_compute[186788]: 2025-11-22 07:58:49.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:58:50 np0005531888 nova_compute[186788]: 2025-11-22 07:58:50.393 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:50 np0005531888 nova_compute[186788]: 2025-11-22 07:58:50.393 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:50 np0005531888 nova_compute[186788]: 2025-11-22 07:58:50.393 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:58:50 np0005531888 nova_compute[186788]: 2025-11-22 07:58:50.393 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c1f4b900-94ac-4865-bbb3-cb003a35e9ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.322 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.813 186792 DEBUG nova.compute.manager [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.814 186792 DEBUG oslo_concurrency.lockutils [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.814 186792 DEBUG oslo_concurrency.lockutils [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.814 186792 DEBUG oslo_concurrency.lockutils [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.815 186792 DEBUG nova.compute.manager [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:51 np0005531888 nova_compute[186788]: 2025-11-22 07:58:51.815 186792 WARNING nova.compute.manager [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received unexpected event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:58:51 np0005531888 podman[225637]: 2025-11-22 07:58:51.840398278 +0000 UTC m=+0.059954256 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:58:51 np0005531888 podman[225638]: 2025-11-22 07:58:51.840809788 +0000 UTC m=+0.056051330 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.484 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updating instance_info_cache with network_info: [{"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.498 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-c1f4b900-94ac-4865-bbb3-cb003a35e9ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.498 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.499 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.500 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.500 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.500 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.516 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:58:52 np0005531888 nova_compute[186788]: 2025-11-22 07:58:52.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:54 np0005531888 NetworkManager[55166]: <info>  [1763798334.9614] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Nov 22 02:58:54 np0005531888 NetworkManager[55166]: <info>  [1763798334.9623] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Nov 22 02:58:54 np0005531888 nova_compute[186788]: 2025-11-22 07:58:54.962 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.023 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:55 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:55Z|00221|binding|INFO|Releasing lport a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31 from this chassis (sb_readonly=0)
Nov 22 02:58:55 np0005531888 ovn_controller[95067]: 2025-11-22T07:58:55Z|00222|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.038 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.938 186792 DEBUG nova.compute.manager [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.938 186792 DEBUG nova.compute.manager [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing instance network info cache due to event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.938 186792 DEBUG oslo_concurrency.lockutils [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.939 186792 DEBUG oslo_concurrency.lockutils [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.939 186792 DEBUG nova.network.neutron [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.969 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.970 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:55 np0005531888 nova_compute[186788]: 2025-11-22 07:58:55.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:58:56 np0005531888 nova_compute[186788]: 2025-11-22 07:58:56.325 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:57 np0005531888 nova_compute[186788]: 2025-11-22 07:58:57.432 186792 DEBUG nova.network.neutron [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updated VIF entry in instance network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:58:57 np0005531888 nova_compute[186788]: 2025-11-22 07:58:57.433 186792 DEBUG nova.network.neutron [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:57 np0005531888 nova_compute[186788]: 2025-11-22 07:58:57.450 186792 DEBUG oslo_concurrency.lockutils [req-d83d0d56-d801-443e-a24e-5ed3d9e6766e req-35fb373c-d612-433e-b8a2-198168052e5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:57 np0005531888 nova_compute[186788]: 2025-11-22 07:58:57.882 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:58 np0005531888 nova_compute[186788]: 2025-11-22 07:58:58.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:58 np0005531888 nova_compute[186788]: 2025-11-22 07:58:58.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:58 np0005531888 nova_compute[186788]: 2025-11-22 07:58:58.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:58 np0005531888 nova_compute[186788]: 2025-11-22 07:58:58.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:58 np0005531888 nova_compute[186788]: 2025-11-22 07:58:58.977 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.058 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.130 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.133 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.204 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.211 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.308 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.309 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.367 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.531 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.532 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5419MB free_disk=73.3204574584961GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.532 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.533 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.714 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance c1f4b900-94ac-4865-bbb3-cb003a35e9ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.715 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.715 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.715 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.877 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.896 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.920 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:58:59 np0005531888 nova_compute[186788]: 2025-11-22 07:58:59.920 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:00 np0005531888 podman[225691]: 2025-11-22 07:59:00.69443651 +0000 UTC m=+0.068093037 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Nov 22 02:59:01 np0005531888 nova_compute[186788]: 2025-11-22 07:59:01.329 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:01 np0005531888 nova_compute[186788]: 2025-11-22 07:59:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:02 np0005531888 nova_compute[186788]: 2025-11-22 07:59:02.459 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:02 np0005531888 podman[225709]: 2025-11-22 07:59:02.698640095 +0000 UTC m=+0.070729001 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:59:02 np0005531888 nova_compute[186788]: 2025-11-22 07:59:02.883 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:06 np0005531888 nova_compute[186788]: 2025-11-22 07:59:06.336 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:06 np0005531888 nova_compute[186788]: 2025-11-22 07:59:06.744 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:06 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:06Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b9:5f:a6 10.100.0.4
Nov 22 02:59:06 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:06Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:5f:a6 10.100.0.4
Nov 22 02:59:07 np0005531888 podman[225749]: 2025-11-22 07:59:07.695370222 +0000 UTC m=+0.068687041 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:59:07 np0005531888 nova_compute[186788]: 2025-11-22 07:59:07.885 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.357 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.358 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.370 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.475 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.476 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.483 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.484 186792 INFO nova.compute.claims [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Claim successful on node compute-2.ctlplane.example.com
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.664 186792 DEBUG nova.compute.provider_tree [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.685 186792 DEBUG nova.scheduler.client.report [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.706 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.708 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.764 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.765 186792 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.791 186792 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:59:11 np0005531888 nova_compute[186788]: 2025-11-22 07:59:11.837 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.036 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.038 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.038 186792 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Creating image(s)
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.039 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "/var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.039 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "/var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.040 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "/var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.058 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.116 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.118 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.118 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.130 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.183 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.184 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.227 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.228 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.228 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.291 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.292 186792 DEBUG nova.virt.disk.api [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Checking if we can resize image /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.293 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.359 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.361 186792 DEBUG nova.virt.disk.api [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Cannot resize image /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.361 186792 DEBUG nova.objects.instance [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.374 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.375 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Ensure instance console log exists: /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.376 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.376 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.377 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.441 186792 DEBUG nova.policy [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf1790780fd64791b117114d170d6d90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 02:59:12 np0005531888 nova_compute[186788]: 2025-11-22 07:59:12.888 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:13 np0005531888 podman[225785]: 2025-11-22 07:59:13.715923725 +0000 UTC m=+0.083532667 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Nov 22 02:59:13 np0005531888 podman[225786]: 2025-11-22 07:59:13.741931194 +0000 UTC m=+0.105572308 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:59:14 np0005531888 nova_compute[186788]: 2025-11-22 07:59:14.334 186792 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Successfully created port: ef99e0b6-d66f-4a41-9288-b64272928405 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.743 186792 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Successfully updated port: ef99e0b6-d66f-4a41-9288-b64272928405 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.758 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "refresh_cache-6d8dfcf7-03f2-4533-bda6-95bb65d3f359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.758 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquired lock "refresh_cache-6d8dfcf7-03f2-4533-bda6-95bb65d3f359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.758 186792 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.898 186792 DEBUG nova.compute.manager [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-changed-ef99e0b6-d66f-4a41-9288-b64272928405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.899 186792 DEBUG nova.compute.manager [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Refreshing instance network info cache due to event network-changed-ef99e0b6-d66f-4a41-9288-b64272928405. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 02:59:15 np0005531888 nova_compute[186788]: 2025-11-22 07:59:15.900 186792 DEBUG oslo_concurrency.lockutils [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6d8dfcf7-03f2-4533-bda6-95bb65d3f359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:59:16 np0005531888 nova_compute[186788]: 2025-11-22 07:59:16.008 186792 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:59:16 np0005531888 nova_compute[186788]: 2025-11-22 07:59:16.344 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.786 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.786 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.787 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.787 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.787 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.801 186792 INFO nova.compute.manager [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Terminating instance
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.810 186792 DEBUG nova.compute.manager [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.890 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.939 186792 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Updating instance_info_cache with network_info: [{"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.961 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Releasing lock "refresh_cache-6d8dfcf7-03f2-4533-bda6-95bb65d3f359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.961 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Instance network_info: |[{"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.961 186792 DEBUG oslo_concurrency.lockutils [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6d8dfcf7-03f2-4533-bda6-95bb65d3f359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.962 186792 DEBUG nova.network.neutron [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Refreshing network info cache for port ef99e0b6-d66f-4a41-9288-b64272928405 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.964 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Start _get_guest_xml network_info=[{"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.969 186792 WARNING nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.973 186792 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.973 186792 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.982 186792 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.983 186792 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.984 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.984 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.985 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.985 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.985 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.985 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.986 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.986 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.986 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.986 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.987 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.987 186792 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.990 186792 DEBUG nova.virt.libvirt.vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-1',id=86,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name=
'tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:11Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=6d8dfcf7-03f2-4533-bda6-95bb65d3f359,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.991 186792 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.992 186792 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:17 np0005531888 nova_compute[186788]: 2025-11-22 07:59:17.993 186792 DEBUG nova.objects.instance [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:18 np0005531888 kernel: tap0ee87554-bf (unregistering): left promiscuous mode
Nov 22 02:59:18 np0005531888 NetworkManager[55166]: <info>  [1763798358.0033] device (tap0ee87554-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.006 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <uuid>6d8dfcf7-03f2-4533-bda6-95bb65d3f359</uuid>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <name>instance-00000056</name>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1317504943-1</nova:name>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 07:59:17</nova:creationTime>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:user uuid="cf1790780fd64791b117114d170d6d90">tempest-ListServersNegativeTestJSON-1715955177-project-member</nova:user>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:project uuid="16ccb24424c54ae1a1b0d7eef6f7d690">tempest-ListServersNegativeTestJSON-1715955177</nova:project>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        <nova:port uuid="ef99e0b6-d66f-4a41-9288-b64272928405">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <system>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <entry name="serial">6d8dfcf7-03f2-4533-bda6-95bb65d3f359</entry>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <entry name="uuid">6d8dfcf7-03f2-4533-bda6-95bb65d3f359</entry>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </system>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <os>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </os>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <features>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </features>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </clock>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  <devices>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.config"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </disk>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:94:f6:7a"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <target dev="tapef99e0b6-d6"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </interface>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/console.log" append="off"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </serial>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <video>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </video>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </rng>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 02:59:18 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 02:59:18 np0005531888 nova_compute[186788]:  </devices>
Nov 22 02:59:18 np0005531888 nova_compute[186788]: </domain>
Nov 22 02:59:18 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:59:18 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:18Z|00223|binding|INFO|Releasing lport 0ee87554-bfe6-414b-a745-e97be6123cf6 from this chassis (sb_readonly=0)
Nov 22 02:59:18 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:18Z|00224|binding|INFO|Setting lport 0ee87554-bfe6-414b-a745-e97be6123cf6 down in Southbound
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.008 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Preparing to wait for external event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.010 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:18 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:18Z|00225|binding|INFO|Removing iface tap0ee87554-bf ovn-installed in OVS
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.010 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.010 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.012 186792 DEBUG nova.virt.libvirt.vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-1',id=86,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_
user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:11Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=6d8dfcf7-03f2-4533-bda6-95bb65d3f359,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.012 186792 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.013 186792 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.014 186792 DEBUG os_vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.014 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.016 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.016 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.016 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.020 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.020 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef99e0b6-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.019 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:f0:34 10.100.0.6'], port_security=['fa:16:3e:cc:f0:34 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c1f4b900-94ac-4865-bbb3-cb003a35e9ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '565a9d7c-3f19-4203-8a5b-2a4d77b7ea7e 726ed215-2cc1-4cd0-860c-0d95ad883b6b f21ff6c8-202c-4eaf-8b6c-03ee8432b50d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63d51e5f-a087-4eb1-a0c4-4a9ee7856c37, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=0ee87554-bfe6-414b-a745-e97be6123cf6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.020 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 0ee87554-bfe6-414b-a745-e97be6123cf6 in datapath 9f740f05-d312-4e00-a27d-4d2a45e526b6 unbound from our chassis#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.020 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef99e0b6-d6, col_values=(('external_ids', {'iface-id': 'ef99e0b6-d66f-4a41-9288-b64272928405', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:f6:7a', 'vm-uuid': '6d8dfcf7-03f2-4533-bda6-95bb65d3f359'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.022 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f740f05-d312-4e00-a27d-4d2a45e526b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:59:18 np0005531888 NetworkManager[55166]: <info>  [1763798358.0236] manager: (tapef99e0b6-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.024 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[34e9a57a-249f-4de5-8dfd-3e432974b0df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.024 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 namespace which is not needed anymore#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.025 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.030 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.037 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.039 186792 INFO os_vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6')#033[00m
Nov 22 02:59:18 np0005531888 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000051.scope: Deactivated successfully.
Nov 22 02:59:18 np0005531888 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000051.scope: Consumed 15.919s CPU time.
Nov 22 02:59:18 np0005531888 systemd-machined[153106]: Machine qemu-37-instance-00000051 terminated.
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.116 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.116 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.116 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No VIF found with MAC fa:16:3e:94:f6:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.117 186792 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Using config drive#033[00m
Nov 22 02:59:18 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [NOTICE]   (225356) : haproxy version is 2.8.14-c23fe91
Nov 22 02:59:18 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [NOTICE]   (225356) : path to executable is /usr/sbin/haproxy
Nov 22 02:59:18 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [WARNING]  (225356) : Exiting Master process...
Nov 22 02:59:18 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [ALERT]    (225356) : Current worker (225358) exited with code 143 (Terminated)
Nov 22 02:59:18 np0005531888 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[225352]: [WARNING]  (225356) : All workers exited. Exiting... (0)
Nov 22 02:59:18 np0005531888 systemd[1]: libpod-81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828.scope: Deactivated successfully.
Nov 22 02:59:18 np0005531888 podman[225856]: 2025-11-22 07:59:18.204838048 +0000 UTC m=+0.068622789 container died 81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.283 186792 INFO nova.virt.libvirt.driver [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Instance destroyed successfully.#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.284 186792 DEBUG nova.objects.instance [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'resources' on Instance uuid c1f4b900-94ac-4865-bbb3-cb003a35e9ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.305 186792 DEBUG nova.virt.libvirt.vif [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-322704032',display_name='tempest-SecurityGroupsTestJSON-server-322704032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-322704032',id=81,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-b530zn28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:58:28Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=c1f4b900-94ac-4865-bbb3-cb003a35e9ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.305 186792 DEBUG nova.network.os_vif_util [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "0ee87554-bfe6-414b-a745-e97be6123cf6", "address": "fa:16:3e:cc:f0:34", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ee87554-bf", "ovs_interfaceid": "0ee87554-bfe6-414b-a745-e97be6123cf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.306 186792 DEBUG nova.network.os_vif_util [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.307 186792 DEBUG os_vif [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.309 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ee87554-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828-userdata-shm.mount: Deactivated successfully.
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.313 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:59:18 np0005531888 systemd[1]: var-lib-containers-storage-overlay-59e351f87d2ee69ebac02e5e0e4c7d750aedeaf42356480c744c6168e33afbc9-merged.mount: Deactivated successfully.
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.315 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.318 186792 INFO os_vif [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:f0:34,bridge_name='br-int',has_traffic_filtering=True,id=0ee87554-bfe6-414b-a745-e97be6123cf6,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ee87554-bf')#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.318 186792 INFO nova.virt.libvirt.driver [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Deleting instance files /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee_del#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.319 186792 INFO nova.virt.libvirt.driver [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Deletion of /var/lib/nova/instances/c1f4b900-94ac-4865-bbb3-cb003a35e9ee_del complete#033[00m
Nov 22 02:59:18 np0005531888 podman[225856]: 2025-11-22 07:59:18.384092468 +0000 UTC m=+0.247877209 container cleanup 81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.389 186792 INFO nova.compute.manager [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.389 186792 DEBUG oslo.service.loopingcall [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.389 186792 DEBUG nova.compute.manager [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.389 186792 DEBUG nova.network.neutron [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:59:18 np0005531888 systemd[1]: libpod-conmon-81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828.scope: Deactivated successfully.
Nov 22 02:59:18 np0005531888 podman[225908]: 2025-11-22 07:59:18.635153184 +0000 UTC m=+0.225609991 container remove 81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.642 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc28084-347f-4bc2-9f99-3d2f448dcbee]: (4, ('Sat Nov 22 07:59:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 (81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828)\n81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828\nSat Nov 22 07:59:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 (81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828)\n81ee96c995acb66b4b30b786ddf01b6ca50fd58640265ce98d158e6f3f4c2828\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.645 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4458a0-cc18-4120-8533-6ab6c9559d24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.646 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f740f05-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.648 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 kernel: tap9f740f05-d0: left promiscuous mode
Nov 22 02:59:18 np0005531888 nova_compute[186788]: 2025-11-22 07:59:18.663 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.666 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[496f70c3-a61e-4af0-ab53-8994ff9b7496]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.681 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[692db76c-f1fe-473f-893a-30d6b286a042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.682 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5eb79d8-f8c1-4894-879e-3d20cb03cf32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.696 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5c0065-ec5c-45d9-a5aa-a44a698498e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504822, 'reachable_time': 34537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225927, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.698 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:59:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:18.698 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0d80a4fa-3860-4286-8a4b-1be63f478db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531888 systemd[1]: run-netns-ovnmeta\x2d9f740f05\x2dd312\x2d4e00\x2da27d\x2d4d2a45e526b6.mount: Deactivated successfully.
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.443 186792 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Creating config drive at /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.config#033[00m
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.448 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe2kqs6n2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.573 186792 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe2kqs6n2" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:19 np0005531888 kernel: tapef99e0b6-d6: entered promiscuous mode
Nov 22 02:59:19 np0005531888 NetworkManager[55166]: <info>  [1763798359.6295] manager: (tapef99e0b6-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Nov 22 02:59:19 np0005531888 systemd-udevd[225836]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.631 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:19Z|00226|binding|INFO|Claiming lport ef99e0b6-d66f-4a41-9288-b64272928405 for this chassis.
Nov 22 02:59:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:19Z|00227|binding|INFO|ef99e0b6-d66f-4a41-9288-b64272928405: Claiming fa:16:3e:94:f6:7a 10.100.0.14
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.640 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:f6:7a 10.100.0.14'], port_security=['fa:16:3e:94:f6:7a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6d8dfcf7-03f2-4533-bda6-95bb65d3f359', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4820a7f-a658-410a-b393-c754d89b7982', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ac2bec8-4c70-4af1-8a46-6da94edec63d, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=ef99e0b6-d66f-4a41-9288-b64272928405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.641 104023 INFO neutron.agent.ovn.metadata.agent [-] Port ef99e0b6-d66f-4a41-9288-b64272928405 in datapath d6148823-d007-4a7e-be44-4329f8ecc6e5 bound to our chassis#033[00m
Nov 22 02:59:19 np0005531888 NetworkManager[55166]: <info>  [1763798359.6428] device (tapef99e0b6-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.642 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6148823-d007-4a7e-be44-4329f8ecc6e5#033[00m
Nov 22 02:59:19 np0005531888 NetworkManager[55166]: <info>  [1763798359.6434] device (tapef99e0b6-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:59:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:19Z|00228|binding|INFO|Setting lport ef99e0b6-d66f-4a41-9288-b64272928405 ovn-installed in OVS
Nov 22 02:59:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:19Z|00229|binding|INFO|Setting lport ef99e0b6-d66f-4a41-9288-b64272928405 up in Southbound
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.655 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[046739b4-b68f-460b-b24f-146447192930]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.656 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6148823-d1 in ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.659 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6148823-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.659 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0a31612d-2e19-4f36-96ef-16269b5d3f30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.660 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eba33678-5447-4665-8e61-a035a7bb621f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 systemd-machined[153106]: New machine qemu-39-instance-00000056.
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.673 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[474219b2-b186-4611-9188-b999564c8b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 systemd[1]: Started Virtual Machine qemu-39-instance-00000056.
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.686 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[75dc7d03-9be4-4b0b-83be-57f54c87a705]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.718 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f5feba5c-6480-457a-9f37-2de834f163dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 NetworkManager[55166]: <info>  [1763798359.7249] manager: (tapd6148823-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/126)
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.725 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8bf896-f3ef-4870-8ab2-07bc343c438a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.768 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8439f6fe-4563-44cd-9246-e73612a8ab50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.772 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[31c5d7de-810d-4269-ac1e-9e0449cb9bb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 NetworkManager[55166]: <info>  [1763798359.7946] device (tapd6148823-d0): carrier: link connected
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.800 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[02ac5e7f-fbb1-440b-92d2-680c6aa22564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.816 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c39241-28ed-4587-888d-94400fbf2042]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6148823-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:f2:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510043, 'reachable_time': 30008, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225977, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.831 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[086e4af7-45e1-4246-ad5e-94d18deec4bc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:f2ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510043, 'tstamp': 510043}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225978, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.847 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[447f8814-b6a5-4ba5-88d2-7f0cbe2b015d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6148823-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:f2:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510043, 'reachable_time': 30008, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225979, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.873 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed131fd-e9ec-4e05-9d06-14c8b581bdc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.932 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c61e9b-33c8-4574-86c0-6371bb9dfcbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.933 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6148823-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.934 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.934 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6148823-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531888 NetworkManager[55166]: <info>  [1763798359.9364] manager: (tapd6148823-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Nov 22 02:59:19 np0005531888 kernel: tapd6148823-d0: entered promiscuous mode
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.939 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.942 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6148823-d0, col_values=(('external_ids', {'iface-id': '2f86d506-522f-4def-915e-a14693535092'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.943 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:19Z|00230|binding|INFO|Releasing lport 2f86d506-522f-4def-915e-a14693535092 from this chassis (sb_readonly=0)
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.946 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.947 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a1427e-8cfe-447f-b7f6-4eaa24ca500e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.948 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:59:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:19.949 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'env', 'PROCESS_TAG=haproxy-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6148823-d007-4a7e-be44-4329f8ecc6e5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:59:19 np0005531888 nova_compute[186788]: 2025-11-22 07:59:19.954 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.061 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798360.0612473, 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.069 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] VM Started (Lifecycle Event)#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.084 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.088 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798360.0621374, 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.089 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.108 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.112 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.129 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:59:20 np0005531888 podman[226018]: 2025-11-22 07:59:20.360332537 +0000 UTC m=+0.099337954 container create 9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 02:59:20 np0005531888 podman[226018]: 2025-11-22 07:59:20.28644139 +0000 UTC m=+0.025446827 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:59:20 np0005531888 systemd[1]: Started libpod-conmon-9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586.scope.
Nov 22 02:59:20 np0005531888 systemd[1]: Started libcrun container.
Nov 22 02:59:20 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/840549ca7f80155847ae72fd9aad4ce574fedc7f8cd33857fefa9b6a34b2e488/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:59:20 np0005531888 podman[226018]: 2025-11-22 07:59:20.52667604 +0000 UTC m=+0.265681477 container init 9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 02:59:20 np0005531888 podman[226018]: 2025-11-22 07:59:20.532124854 +0000 UTC m=+0.271130261 container start 9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 02:59:20 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [NOTICE]   (226037) : New worker (226039) forked
Nov 22 02:59:20 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [NOTICE]   (226037) : Loading success.
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.636 186792 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-vif-unplugged-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.636 186792 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.637 186792 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.637 186792 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.638 186792 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] No waiting events found dispatching network-vif-unplugged-0ee87554-bfe6-414b-a745-e97be6123cf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.638 186792 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-vif-unplugged-0ee87554-bfe6-414b-a745-e97be6123cf6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.639 186792 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.639 186792 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.639 186792 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.639 186792 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.640 186792 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] No waiting events found dispatching network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.640 186792 WARNING nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received unexpected event network-vif-plugged-0ee87554-bfe6-414b-a745-e97be6123cf6 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.726 186792 DEBUG nova.compute.manager [req-53db4253-70f2-4a21-a8ed-e84f31f55e77 req-c73ad0a5-feb2-4ae9-b13c-b9a8503d9457 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.726 186792 DEBUG oslo_concurrency.lockutils [req-53db4253-70f2-4a21-a8ed-e84f31f55e77 req-c73ad0a5-feb2-4ae9-b13c-b9a8503d9457 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.727 186792 DEBUG oslo_concurrency.lockutils [req-53db4253-70f2-4a21-a8ed-e84f31f55e77 req-c73ad0a5-feb2-4ae9-b13c-b9a8503d9457 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.727 186792 DEBUG oslo_concurrency.lockutils [req-53db4253-70f2-4a21-a8ed-e84f31f55e77 req-c73ad0a5-feb2-4ae9-b13c-b9a8503d9457 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.727 186792 DEBUG nova.compute.manager [req-53db4253-70f2-4a21-a8ed-e84f31f55e77 req-c73ad0a5-feb2-4ae9-b13c-b9a8503d9457 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Processing event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.728 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.732 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798360.7324579, 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.733 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.734 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.738 186792 INFO nova.virt.libvirt.driver [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Instance spawned successfully.#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.739 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.753 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.760 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.765 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.766 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.766 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.766 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.767 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.767 186792 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.801 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.835 186792 DEBUG nova.network.neutron [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Updated VIF entry in instance network info cache for port ef99e0b6-d66f-4a41-9288-b64272928405. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.836 186792 DEBUG nova.network.neutron [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Updating instance_info_cache with network_info: [{"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.848 186792 INFO nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Took 8.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.849 186792 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.850 186792 DEBUG nova.network.neutron [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.854 186792 DEBUG oslo_concurrency.lockutils [req-32cdf77c-7ef2-4421-9507-6fc97409bef9 req-da2fc08f-37b1-4e59-a020-dea09ed2443d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6d8dfcf7-03f2-4533-bda6-95bb65d3f359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.876 186792 INFO nova.compute.manager [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Took 2.49 seconds to deallocate network for instance.#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.959 186792 INFO nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Took 9.52 seconds to build instance.#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.980 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.981 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:20 np0005531888 nova_compute[186788]: 2025-11-22 07:59:20.983 186792 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:21 np0005531888 nova_compute[186788]: 2025-11-22 07:59:21.086 186792 DEBUG nova.compute.provider_tree [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:59:21 np0005531888 nova_compute[186788]: 2025-11-22 07:59:21.104 186792 DEBUG nova.scheduler.client.report [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:59:21 np0005531888 nova_compute[186788]: 2025-11-22 07:59:21.158 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:21 np0005531888 nova_compute[186788]: 2025-11-22 07:59:21.189 186792 INFO nova.scheduler.client.report [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Deleted allocations for instance c1f4b900-94ac-4865-bbb3-cb003a35e9ee#033[00m
Nov 22 02:59:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:21.259 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:21.260 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:59:21 np0005531888 nova_compute[186788]: 2025-11-22 07:59:21.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:21 np0005531888 nova_compute[186788]: 2025-11-22 07:59:21.285 186792 DEBUG oslo_concurrency.lockutils [None req-2a768c8e-c137-4fc4-ad23-58c5e4f96ab1 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "c1f4b900-94ac-4865-bbb3-cb003a35e9ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:22 np0005531888 podman[226049]: 2025-11-22 07:59:22.68755161 +0000 UTC m=+0.055006074 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:59:22 np0005531888 podman[226048]: 2025-11-22 07:59:22.693283341 +0000 UTC m=+0.060785646 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.812 186792 DEBUG nova.compute.manager [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Received event network-vif-deleted-0ee87554-bfe6-414b-a745-e97be6123cf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.812 186792 DEBUG nova.compute.manager [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.812 186792 DEBUG oslo_concurrency.lockutils [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.813 186792 DEBUG oslo_concurrency.lockutils [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.813 186792 DEBUG oslo_concurrency.lockutils [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.813 186792 DEBUG nova.compute.manager [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] No waiting events found dispatching network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.814 186792 WARNING nova.compute.manager [req-bfd6aaa0-2501-40cb-b0b6-62cfe7e1b4f0 req-aa648e23-d0d8-4563-9334-ea58f7657204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received unexpected event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:59:22 np0005531888 nova_compute[186788]: 2025-11-22 07:59:22.893 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:23 np0005531888 nova_compute[186788]: 2025-11-22 07:59:23.316 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.626 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.626 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.627 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.627 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.627 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.633 186792 INFO nova.compute.manager [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Terminating instance#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.639 186792 DEBUG nova.compute.manager [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:59:26 np0005531888 kernel: tapef99e0b6-d6 (unregistering): left promiscuous mode
Nov 22 02:59:26 np0005531888 NetworkManager[55166]: <info>  [1763798366.6645] device (tapef99e0b6-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.676 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:26Z|00231|binding|INFO|Releasing lport ef99e0b6-d66f-4a41-9288-b64272928405 from this chassis (sb_readonly=0)
Nov 22 02:59:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:26Z|00232|binding|INFO|Setting lport ef99e0b6-d66f-4a41-9288-b64272928405 down in Southbound
Nov 22 02:59:26 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:26Z|00233|binding|INFO|Removing iface tapef99e0b6-d6 ovn-installed in OVS
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.678 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:26.685 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:f6:7a 10.100.0.14'], port_security=['fa:16:3e:94:f6:7a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6d8dfcf7-03f2-4533-bda6-95bb65d3f359', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4820a7f-a658-410a-b393-c754d89b7982', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ac2bec8-4c70-4af1-8a46-6da94edec63d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=ef99e0b6-d66f-4a41-9288-b64272928405) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:26.686 104023 INFO neutron.agent.ovn.metadata.agent [-] Port ef99e0b6-d66f-4a41-9288-b64272928405 in datapath d6148823-d007-4a7e-be44-4329f8ecc6e5 unbound from our chassis#033[00m
Nov 22 02:59:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:26.688 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6148823-d007-4a7e-be44-4329f8ecc6e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:59:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:26.689 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b3abd3db-5e65-42ff-bdc5-05043d2b2e3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:26.690 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 namespace which is not needed anymore#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.690 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:26 np0005531888 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000056.scope: Deactivated successfully.
Nov 22 02:59:26 np0005531888 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000056.scope: Consumed 6.342s CPU time.
Nov 22 02:59:26 np0005531888 systemd-machined[153106]: Machine qemu-39-instance-00000056 terminated.
Nov 22 02:59:26 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [NOTICE]   (226037) : haproxy version is 2.8.14-c23fe91
Nov 22 02:59:26 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [NOTICE]   (226037) : path to executable is /usr/sbin/haproxy
Nov 22 02:59:26 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [WARNING]  (226037) : Exiting Master process...
Nov 22 02:59:26 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [ALERT]    (226037) : Current worker (226039) exited with code 143 (Terminated)
Nov 22 02:59:26 np0005531888 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226033]: [WARNING]  (226037) : All workers exited. Exiting... (0)
Nov 22 02:59:26 np0005531888 systemd[1]: libpod-9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586.scope: Deactivated successfully.
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.868 186792 DEBUG nova.compute.manager [req-2e0f7030-9022-4495-afbd-c788357bbe20 req-eeb6c8b4-65b4-4584-8057-b1bf67ba3231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-vif-unplugged-ef99e0b6-d66f-4a41-9288-b64272928405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.868 186792 DEBUG oslo_concurrency.lockutils [req-2e0f7030-9022-4495-afbd-c788357bbe20 req-eeb6c8b4-65b4-4584-8057-b1bf67ba3231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.869 186792 DEBUG oslo_concurrency.lockutils [req-2e0f7030-9022-4495-afbd-c788357bbe20 req-eeb6c8b4-65b4-4584-8057-b1bf67ba3231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.869 186792 DEBUG oslo_concurrency.lockutils [req-2e0f7030-9022-4495-afbd-c788357bbe20 req-eeb6c8b4-65b4-4584-8057-b1bf67ba3231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.869 186792 DEBUG nova.compute.manager [req-2e0f7030-9022-4495-afbd-c788357bbe20 req-eeb6c8b4-65b4-4584-8057-b1bf67ba3231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] No waiting events found dispatching network-vif-unplugged-ef99e0b6-d66f-4a41-9288-b64272928405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.870 186792 DEBUG nova.compute.manager [req-2e0f7030-9022-4495-afbd-c788357bbe20 req-eeb6c8b4-65b4-4584-8057-b1bf67ba3231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-vif-unplugged-ef99e0b6-d66f-4a41-9288-b64272928405 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:59:26 np0005531888 podman[226113]: 2025-11-22 07:59:26.874365913 +0000 UTC m=+0.091701138 container died 9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.907 186792 INFO nova.virt.libvirt.driver [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Instance destroyed successfully.#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.909 186792 DEBUG nova.objects.instance [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'resources' on Instance uuid 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.930 186792 DEBUG nova.virt.libvirt.vif [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-1',id=86,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:59:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:20Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=6d8dfcf7-03f2-4533-bda6-95bb65d3f359,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.931 186792 DEBUG nova.network.os_vif_util [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "ef99e0b6-d66f-4a41-9288-b64272928405", "address": "fa:16:3e:94:f6:7a", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99e0b6-d6", "ovs_interfaceid": "ef99e0b6-d66f-4a41-9288-b64272928405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.932 186792 DEBUG nova.network.os_vif_util [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.932 186792 DEBUG os_vif [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.933 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.934 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef99e0b6-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.939 186792 INFO os_vif [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:f6:7a,bridge_name='br-int',has_traffic_filtering=True,id=ef99e0b6-d66f-4a41-9288-b64272928405,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99e0b6-d6')#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.940 186792 INFO nova.virt.libvirt.driver [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Deleting instance files /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359_del#033[00m
Nov 22 02:59:26 np0005531888 nova_compute[186788]: 2025-11-22 07:59:26.940 186792 INFO nova.virt.libvirt.driver [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Deletion of /var/lib/nova/instances/6d8dfcf7-03f2-4533-bda6-95bb65d3f359_del complete#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.001 186792 INFO nova.compute.manager [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.002 186792 DEBUG oslo.service.loopingcall [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.003 186792 DEBUG nova.compute.manager [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.003 186792 DEBUG nova.network.neutron [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:59:27 np0005531888 systemd[1]: var-lib-containers-storage-overlay-840549ca7f80155847ae72fd9aad4ce574fedc7f8cd33857fefa9b6a34b2e488-merged.mount: Deactivated successfully.
Nov 22 02:59:27 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586-userdata-shm.mount: Deactivated successfully.
Nov 22 02:59:27 np0005531888 podman[226113]: 2025-11-22 07:59:27.346337214 +0000 UTC m=+0.563672449 container cleanup 9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 02:59:27 np0005531888 systemd[1]: libpod-conmon-9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586.scope: Deactivated successfully.
Nov 22 02:59:27 np0005531888 podman[226161]: 2025-11-22 07:59:27.864497161 +0000 UTC m=+0.494394933 container remove 9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.872 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e688857a-f30b-47fa-bdd4-5efa85d405f0]: (4, ('Sat Nov 22 07:59:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 (9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586)\n9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586\nSat Nov 22 07:59:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 (9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586)\n9712f310237b7693c4e3982bf8d2173e501a46e48238696b4b883199a6c9e586\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.873 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba49724-d1f6-4a11-a74c-59aeca5b6f44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.874 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6148823-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:27 np0005531888 kernel: tapd6148823-d0: left promiscuous mode
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.888 186792 DEBUG nova.network.neutron [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.906 186792 INFO nova.compute.manager [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Took 0.90 seconds to deallocate network for instance.#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.922 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.927 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[07513fcd-99da-466f-9850-3d9074f20efc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.955 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[41edac71-62f7-4cea-ad70-14316e7a5584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.956 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6154b2a0-f25b-4d8e-9e1c-e938b9e2fc14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.972 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5cbc51-0965-4138-8e9d-4f8f650d31c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510035, 'reachable_time': 43711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226178, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 systemd[1]: run-netns-ovnmeta\x2dd6148823\x2dd007\x2d4a7e\x2dbe44\x2d4329f8ecc6e5.mount: Deactivated successfully.
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.976 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:59:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:27.976 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[13dfb8bd-4f9f-48ae-9f88-f8a20d76b36c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.981 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.982 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:27 np0005531888 nova_compute[186788]: 2025-11-22 07:59:27.985 186792 DEBUG nova.compute.manager [req-b94470b0-0c1c-4235-85c7-5696c517f43d req-d396d78f-13ad-4983-bc4c-f78fa223ae25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-vif-deleted-ef99e0b6-d66f-4a41-9288-b64272928405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:28 np0005531888 nova_compute[186788]: 2025-11-22 07:59:28.052 186792 DEBUG nova.compute.provider_tree [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:59:28 np0005531888 nova_compute[186788]: 2025-11-22 07:59:28.063 186792 DEBUG nova.scheduler.client.report [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:59:28 np0005531888 nova_compute[186788]: 2025-11-22 07:59:28.083 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:28 np0005531888 nova_compute[186788]: 2025-11-22 07:59:28.107 186792 INFO nova.scheduler.client.report [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Deleted allocations for instance 6d8dfcf7-03f2-4533-bda6-95bb65d3f359#033[00m
Nov 22 02:59:28 np0005531888 nova_compute[186788]: 2025-11-22 07:59:28.179 186792 DEBUG oslo_concurrency.lockutils [None req-181b0f23-e7d9-40f4-a4fe-9ab3b5801981 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.425 186792 DEBUG nova.compute.manager [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.484 186792 INFO nova.compute.manager [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] instance snapshotting#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.485 186792 DEBUG nova.objects.instance [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'flavor' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.704 186792 DEBUG nova.compute.manager [req-81902254-33f5-46b5-9558-f53b07be02f2 req-4aee7ad0-27a6-40c2-b2a9-febf0301d4f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.704 186792 DEBUG oslo_concurrency.lockutils [req-81902254-33f5-46b5-9558-f53b07be02f2 req-4aee7ad0-27a6-40c2-b2a9-febf0301d4f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.704 186792 DEBUG oslo_concurrency.lockutils [req-81902254-33f5-46b5-9558-f53b07be02f2 req-4aee7ad0-27a6-40c2-b2a9-febf0301d4f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.705 186792 DEBUG oslo_concurrency.lockutils [req-81902254-33f5-46b5-9558-f53b07be02f2 req-4aee7ad0-27a6-40c2-b2a9-febf0301d4f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d8dfcf7-03f2-4533-bda6-95bb65d3f359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.705 186792 DEBUG nova.compute.manager [req-81902254-33f5-46b5-9558-f53b07be02f2 req-4aee7ad0-27a6-40c2-b2a9-febf0301d4f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] No waiting events found dispatching network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:29 np0005531888 nova_compute[186788]: 2025-11-22 07:59:29.705 186792 WARNING nova.compute.manager [req-81902254-33f5-46b5-9558-f53b07be02f2 req-4aee7ad0-27a6-40c2-b2a9-febf0301d4f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Received unexpected event network-vif-plugged-ef99e0b6-d66f-4a41-9288-b64272928405 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.005 186792 INFO nova.virt.libvirt.driver [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Beginning live snapshot process#033[00m
Nov 22 02:59:30 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:30Z|00234|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.059 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:30 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.300 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.358 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.360 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.417 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.434 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.492 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.493 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpcdp59kns/b2410313f7b84823baa0f9ca261adc29.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.538 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpcdp59kns/b2410313f7b84823baa0f9ca261adc29.delta 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.539 186792 INFO nova.virt.libvirt.driver [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:59:30 np0005531888 nova_compute[186788]: 2025-11-22 07:59:30.591 186792 DEBUG nova.virt.libvirt.guest [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.095 186792 DEBUG nova.virt.libvirt.guest [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.100 186792 INFO nova.virt.libvirt.driver [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.144 186792 DEBUG nova.privsep.utils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.145 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpcdp59kns/b2410313f7b84823baa0f9ca261adc29.delta /var/lib/nova/instances/snapshots/tmpcdp59kns/b2410313f7b84823baa0f9ca261adc29 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:31 np0005531888 podman[226206]: 2025-11-22 07:59:31.234396136 +0000 UTC m=+0.081652401 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:59:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:31.263 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.890 186792 DEBUG oslo_concurrency.processutils [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpcdp59kns/b2410313f7b84823baa0f9ca261adc29.delta /var/lib/nova/instances/snapshots/tmpcdp59kns/b2410313f7b84823baa0f9ca261adc29" returned: 0 in 0.744s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.896 186792 INFO nova.virt.libvirt.driver [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:59:31 np0005531888 nova_compute[186788]: 2025-11-22 07:59:31.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531888 nova_compute[186788]: 2025-11-22 07:59:32.924 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:33 np0005531888 nova_compute[186788]: 2025-11-22 07:59:33.281 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798358.2793918, c1f4b900-94ac-4865-bbb3-cb003a35e9ee => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:33 np0005531888 nova_compute[186788]: 2025-11-22 07:59:33.281 186792 INFO nova.compute.manager [-] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:59:33 np0005531888 nova_compute[186788]: 2025-11-22 07:59:33.294 186792 DEBUG nova.compute.manager [None req-12ca1488-aa68-4798-ba00-a7d0ab9db6dd - - - - - -] [instance: c1f4b900-94ac-4865-bbb3-cb003a35e9ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:33 np0005531888 podman[226237]: 2025-11-22 07:59:33.66621311 +0000 UTC m=+0.044632728 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:59:35 np0005531888 nova_compute[186788]: 2025-11-22 07:59:35.086 186792 INFO nova.virt.libvirt.driver [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Snapshot image upload complete#033[00m
Nov 22 02:59:35 np0005531888 nova_compute[186788]: 2025-11-22 07:59:35.087 186792 INFO nova.compute.manager [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 5.57 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:59:35 np0005531888 nova_compute[186788]: 2025-11-22 07:59:35.524 186792 DEBUG nova.compute.manager [None req-170d87cf-c0e1-4359-a343-d107bed06d6c d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 22 02:59:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:36.812 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:36.812 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 07:59:36.813 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:36 np0005531888 nova_compute[186788]: 2025-11-22 07:59:36.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:36 np0005531888 nova_compute[186788]: 2025-11-22 07:59:36.974 186792 DEBUG nova.compute.manager [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.046 186792 INFO nova.compute.manager [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] instance snapshotting#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.047 186792 DEBUG nova.objects.instance [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'flavor' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.271 186792 INFO nova.virt.libvirt.driver [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Beginning live snapshot process#033[00m
Nov 22 02:59:37 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.783 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:37 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:37Z|00235|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.840 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.842 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.881 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.905 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.918 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.973 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:37 np0005531888 nova_compute[186788]: 2025-11-22 07:59:37.974 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpen0lo0gy/d399a794aa4e4092b49f9a3ac0f8e12c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:38 np0005531888 nova_compute[186788]: 2025-11-22 07:59:38.034 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpen0lo0gy/d399a794aa4e4092b49f9a3ac0f8e12c.delta 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:38 np0005531888 nova_compute[186788]: 2025-11-22 07:59:38.035 186792 INFO nova.virt.libvirt.driver [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:59:38 np0005531888 nova_compute[186788]: 2025-11-22 07:59:38.094 186792 DEBUG nova.virt.libvirt.guest [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:38 np0005531888 podman[226274]: 2025-11-22 07:59:38.116797298 +0000 UTC m=+0.056526661 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., 
io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 02:59:38 np0005531888 nova_compute[186788]: 2025-11-22 07:59:38.596 186792 DEBUG nova.virt.libvirt.guest [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 46727168 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:39 np0005531888 nova_compute[186788]: 2025-11-22 07:59:39.101 186792 DEBUG nova.virt.libvirt.guest [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:39 np0005531888 nova_compute[186788]: 2025-11-22 07:59:39.105 186792 INFO nova.virt.libvirt.driver [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:59:39 np0005531888 nova_compute[186788]: 2025-11-22 07:59:39.155 186792 DEBUG nova.privsep.utils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:59:39 np0005531888 nova_compute[186788]: 2025-11-22 07:59:39.156 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpen0lo0gy/d399a794aa4e4092b49f9a3ac0f8e12c.delta /var/lib/nova/instances/snapshots/tmpen0lo0gy/d399a794aa4e4092b49f9a3ac0f8e12c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:41 np0005531888 nova_compute[186788]: 2025-11-22 07:59:41.586 186792 DEBUG oslo_concurrency.processutils [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpen0lo0gy/d399a794aa4e4092b49f9a3ac0f8e12c.delta /var/lib/nova/instances/snapshots/tmpen0lo0gy/d399a794aa4e4092b49f9a3ac0f8e12c" returned: 0 in 2.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:41 np0005531888 nova_compute[186788]: 2025-11-22 07:59:41.592 186792 INFO nova.virt.libvirt.driver [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:59:41 np0005531888 nova_compute[186788]: 2025-11-22 07:59:41.906 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798366.9046755, 6d8dfcf7-03f2-4533-bda6-95bb65d3f359 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:41 np0005531888 nova_compute[186788]: 2025-11-22 07:59:41.906 186792 INFO nova.compute.manager [-] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:59:41 np0005531888 nova_compute[186788]: 2025-11-22 07:59:41.940 186792 DEBUG nova.compute.manager [None req-01b1916f-ab3a-4af1-82a7-9c085abaa1d1 - - - - - -] [instance: 6d8dfcf7-03f2-4533-bda6-95bb65d3f359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:41 np0005531888 nova_compute[186788]: 2025-11-22 07:59:41.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:42 np0005531888 nova_compute[186788]: 2025-11-22 07:59:42.576 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:42 np0005531888 nova_compute[186788]: 2025-11-22 07:59:42.927 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:44 np0005531888 podman[226312]: 2025-11-22 07:59:44.681141681 +0000 UTC m=+0.058340756 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:59:44 np0005531888 podman[226313]: 2025-11-22 07:59:44.706421463 +0000 UTC m=+0.080895911 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:59:44 np0005531888 nova_compute[186788]: 2025-11-22 07:59:44.861 186792 INFO nova.virt.libvirt.driver [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Snapshot image upload complete#033[00m
Nov 22 02:59:44 np0005531888 nova_compute[186788]: 2025-11-22 07:59:44.862 186792 INFO nova.compute.manager [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 7.79 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:59:45 np0005531888 nova_compute[186788]: 2025-11-22 07:59:45.518 186792 DEBUG nova.compute.manager [None req-deb17f7e-87bf-4db0-898c-9bf182edfaa6 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 22 02:59:46 np0005531888 nova_compute[186788]: 2025-11-22 07:59:46.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:47 np0005531888 ovn_controller[95067]: 2025-11-22T07:59:47Z|00236|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.124 186792 DEBUG nova.compute.manager [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.147 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.194 186792 INFO nova.compute.manager [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] instance snapshotting#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.196 186792 DEBUG nova.objects.instance [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'flavor' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.514 186792 INFO nova.virt.libvirt.driver [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Beginning live snapshot process#033[00m
Nov 22 02:59:47 np0005531888 virtqemud[186358]: invalid argument: disk vda does not have an active block job
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.800 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.864 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.866 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.929 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.934 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json -f qcow2" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.952 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:47 np0005531888 nova_compute[186788]: 2025-11-22 07:59:47.976 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.015 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.016 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpyr0ii3qt/9dbdb083ba3f4d178f0236269e46773f.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.118 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.230 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpyr0ii3qt/9dbdb083ba3f4d178f0236269e46773f.delta 1073741824" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.231 186792 INFO nova.virt.libvirt.driver [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.297 186792 DEBUG nova.virt.libvirt.guest [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.801 186792 DEBUG nova.virt.libvirt.guest [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 7667712 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:48 np0005531888 nova_compute[186788]: 2025-11-22 07:59:48.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:49 np0005531888 nova_compute[186788]: 2025-11-22 07:59:49.304 186792 DEBUG nova.virt.libvirt.guest [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 14483456 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:49 np0005531888 nova_compute[186788]: 2025-11-22 07:59:49.809 186792 DEBUG nova.virt.libvirt.guest [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 22806528 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:49 np0005531888 nova_compute[186788]: 2025-11-22 07:59:49.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.313 186792 DEBUG nova.virt.libvirt.guest [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 53608448 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.819 186792 DEBUG nova.virt.libvirt.guest [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.823 186792 INFO nova.virt.libvirt.driver [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.873 186792 DEBUG nova.privsep.utils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.873 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpyr0ii3qt/9dbdb083ba3f4d178f0236269e46773f.delta /var/lib/nova/instances/snapshots/tmpyr0ii3qt/9dbdb083ba3f4d178f0236269e46773f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.994 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.995 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:59:50 np0005531888 nova_compute[186788]: 2025-11-22 07:59:50.995 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:59:51 np0005531888 nova_compute[186788]: 2025-11-22 07:59:51.944 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:52 np0005531888 nova_compute[186788]: 2025-11-22 07:59:52.930 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:53 np0005531888 podman[226382]: 2025-11-22 07:59:53.672372247 +0000 UTC m=+0.049128960 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:59:53 np0005531888 podman[226383]: 2025-11-22 07:59:53.701599016 +0000 UTC m=+0.072049403 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.051 186792 DEBUG oslo_concurrency.processutils [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpyr0ii3qt/9dbdb083ba3f4d178f0236269e46773f.delta /var/lib/nova/instances/snapshots/tmpyr0ii3qt/9dbdb083ba3f4d178f0236269e46773f" returned: 0 in 3.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.060 186792 INFO nova.virt.libvirt.driver [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.553 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.572 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.572 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.573 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:54 np0005531888 nova_compute[186788]: 2025-11-22 07:59:54.573 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:55 np0005531888 nova_compute[186788]: 2025-11-22 07:59:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:55 np0005531888 nova_compute[186788]: 2025-11-22 07:59:55.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:59:56 np0005531888 nova_compute[186788]: 2025-11-22 07:59:56.946 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:56 np0005531888 nova_compute[186788]: 2025-11-22 07:59:56.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:57 np0005531888 nova_compute[186788]: 2025-11-22 07:59:57.223 186792 INFO nova.virt.libvirt.driver [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Snapshot image upload complete#033[00m
Nov 22 02:59:57 np0005531888 nova_compute[186788]: 2025-11-22 07:59:57.223 186792 INFO nova.compute.manager [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 10.00 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:59:57 np0005531888 nova_compute[186788]: 2025-11-22 07:59:57.589 186792 DEBUG nova.compute.manager [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 22 02:59:57 np0005531888 nova_compute[186788]: 2025-11-22 07:59:57.590 186792 DEBUG nova.compute.manager [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458#033[00m
Nov 22 02:59:57 np0005531888 nova_compute[186788]: 2025-11-22 07:59:57.590 186792 DEBUG nova.compute.manager [None req-1f7dc4b9-11ff-4b5c-92c2-42450ed93010 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Deleting image 2c16bbfa-a2df-4c4f-861c-d350dfb09368 _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463#033[00m
Nov 22 02:59:57 np0005531888 nova_compute[186788]: 2025-11-22 07:59:57.931 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:58 np0005531888 nova_compute[186788]: 2025-11-22 07:59:58.698 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:58 np0005531888 nova_compute[186788]: 2025-11-22 07:59:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:58 np0005531888 nova_compute[186788]: 2025-11-22 07:59:58.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:58 np0005531888 nova_compute[186788]: 2025-11-22 07:59:58.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:58 np0005531888 nova_compute[186788]: 2025-11-22 07:59:58.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:58 np0005531888 nova_compute[186788]: 2025-11-22 07:59:58.978 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.045 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.109 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.110 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.166 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.309 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.310 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5558MB free_disk=73.32070541381836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.310 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.311 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.440 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.441 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.441 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.455 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.470 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.471 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.484 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.504 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.540 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.562 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.700 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:59:59 np0005531888 nova_compute[186788]: 2025-11-22 07:59:59.700 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:01 np0005531888 podman[226427]: 2025-11-22 08:00:01.679373059 +0000 UTC m=+0.054220006 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:00:01 np0005531888 nova_compute[186788]: 2025-11-22 08:00:01.949 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:02 np0005531888 nova_compute[186788]: 2025-11-22 08:00:02.933 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:03 np0005531888 nova_compute[186788]: 2025-11-22 08:00:03.697 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:03 np0005531888 nova_compute[186788]: 2025-11-22 08:00:03.825 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:04 np0005531888 podman[226450]: 2025-11-22 08:00:04.67930792 +0000 UTC m=+0.048636077 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:00:06 np0005531888 nova_compute[186788]: 2025-11-22 08:00:06.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:07 np0005531888 nova_compute[186788]: 2025-11-22 08:00:07.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:08 np0005531888 podman[226474]: 2025-11-22 08:00:08.700262322 +0000 UTC m=+0.077640881 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Nov 22 03:00:09 np0005531888 nova_compute[186788]: 2025-11-22 08:00:09.970 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:09 np0005531888 nova_compute[186788]: 2025-11-22 08:00:09.970 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:09 np0005531888 nova_compute[186788]: 2025-11-22 08:00:09.994 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.025 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.026 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.046 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.083 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.084 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.094 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.094 186792 INFO nova.compute.claims [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.114 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.215 186792 DEBUG nova.compute.provider_tree [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.227 186792 DEBUG nova.scheduler.client.report [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.246 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.247 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.250 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.256 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.257 186792 INFO nova.compute.claims [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.321 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.322 186792 DEBUG nova.network.neutron [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.350 186792 INFO nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.373 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.456 186792 DEBUG nova.compute.provider_tree [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.478 186792 DEBUG nova.scheduler.client.report [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.502 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.504 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.504 186792 INFO nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Creating image(s)#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.507 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "/var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.507 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.508 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.527 186792 DEBUG nova.policy [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.530 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.531 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.534 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.593 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.594 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.595 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.606 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.627 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.627 186792 DEBUG nova.network.neutron [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.651 186792 INFO nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.658 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.659 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.679 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.747 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk 1073741824" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.748 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.748 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.784 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.786 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.786 186792 INFO nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Creating image(s)#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.787 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.787 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.788 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.801 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.819 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.820 186792 DEBUG nova.virt.disk.api [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Checking if we can resize image /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.821 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.841 186792 DEBUG nova.policy [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.863 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.864 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.865 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.877 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.894 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.894 186792 DEBUG nova.virt.disk.api [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Cannot resize image /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.895 186792 DEBUG nova.objects.instance [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid 80cb8b15-443c-424b-894c-1ed6674f77d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.907 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.908 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Ensure instance console log exists: /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.908 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.909 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.909 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.932 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:10 np0005531888 nova_compute[186788]: 2025-11-22 08:00:10.932 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.043 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk 1073741824" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.045 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.045 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.105 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.106 186792 DEBUG nova.virt.disk.api [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.106 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.162 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.163 186792 DEBUG nova.virt.disk.api [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
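The "Cannot resize image ... to a smaller size" message comes from `nova.virt.disk.api.can_resize_image`, which compares the requested size against the image's current virtual size reported by `qemu-img info --output=json`: growing a qcow2 disk is allowed, shrinking is refused. A simplified sketch of that check, operating on the JSON that `qemu-img info` emits (the function name here is illustrative, not Nova's exact code):

```python
import json

def can_resize_image(qemu_img_info_json, requested_bytes):
    """Return True only if the requested size does not shrink the image.

    *qemu_img_info_json* is the JSON string produced by
    `qemu-img info --output=json`, whose "virtual-size" field holds the
    current virtual disk size in bytes.
    """
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    # Growing (or keeping) the virtual size is fine; shrinking a qcow2
    # image in place would truncate guest data, so it is rejected.
    return requested_bytes >= virtual_size
```

In the log above the instance disk was created at 1073741824 bytes (1 GiB) and the flavor requested the same size, so the shrink guard fires and the resize is skipped.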
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.163 186792 DEBUG nova.objects.instance [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.175 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.177 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Ensure instance console log exists: /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.177 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.178 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.178 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.413 186792 DEBUG nova.network.neutron [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Successfully created port: 487183e6-b09b-4561-97a9-8f8e44492221 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.953 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:11 np0005531888 nova_compute[186788]: 2025-11-22 08:00:11.999 186792 DEBUG nova.network.neutron [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Successfully created port: a2f45e58-237f-4de0-8339-5f17a4ad3cfe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.544 186792 DEBUG nova.network.neutron [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Successfully updated port: 487183e6-b09b-4561-97a9-8f8e44492221 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.574 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.575 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.575 186792 DEBUG nova.network.neutron [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.691 186792 DEBUG nova.compute.manager [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-changed-487183e6-b09b-4561-97a9-8f8e44492221 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.692 186792 DEBUG nova.compute.manager [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Refreshing instance network info cache due to event network-changed-487183e6-b09b-4561-97a9-8f8e44492221. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.693 186792 DEBUG oslo_concurrency.lockutils [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.774 186792 DEBUG nova.network.neutron [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:00:12 np0005531888 nova_compute[186788]: 2025-11-22 08:00:12.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.487 186792 DEBUG nova.network.neutron [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Successfully updated port: a2f45e58-237f-4de0-8339-5f17a4ad3cfe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.509 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.510 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.510 186792 DEBUG nova.network.neutron [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.637 186792 DEBUG nova.compute.manager [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.637 186792 DEBUG nova.compute.manager [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing instance network info cache due to event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.637 186792 DEBUG oslo_concurrency.lockutils [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:13 np0005531888 nova_compute[186788]: 2025-11-22 08:00:13.709 186792 DEBUG nova.network.neutron [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.449 186792 DEBUG nova.network.neutron [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updating instance_info_cache with network_info: [{"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.569 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.570 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Instance network_info: |[{"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.574 186792 DEBUG oslo_concurrency.lockutils [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.575 186792 DEBUG nova.network.neutron [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Refreshing network info cache for port 487183e6-b09b-4561-97a9-8f8e44492221 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.578 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Start _get_guest_xml network_info=[{"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.586 186792 WARNING nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.595 186792 DEBUG nova.virt.libvirt.host [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.596 186792 DEBUG nova.virt.libvirt.host [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.600 186792 DEBUG nova.virt.libvirt.host [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.600 186792 DEBUG nova.virt.libvirt.host [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.602 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.602 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.603 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.603 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.603 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.603 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.603 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.604 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.604 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.604 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.604 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.605 186792 DEBUG nova.virt.hardware [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.608 186792 DEBUG nova.virt.libvirt.vif [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1246188074',display_name='tempest-ServerActionsTestOtherB-server-1246188074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1246188074',id=92,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-bcb74bw3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerAction
sTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:10Z,user_data=None,user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=80cb8b15-443c-424b-894c-1ed6674f77d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.608 186792 DEBUG nova.network.os_vif_util [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.609 186792 DEBUG nova.network.os_vif_util [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.610 186792 DEBUG nova.objects.instance [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid 80cb8b15-443c-424b-894c-1ed6674f77d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.629 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <uuid>80cb8b15-443c-424b-894c-1ed6674f77d5</uuid>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <name>instance-0000005c</name>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestOtherB-server-1246188074</nova:name>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:00:14</nova:creationTime>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:port uuid="487183e6-b09b-4561-97a9-8f8e44492221">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="serial">80cb8b15-443c-424b-894c-1ed6674f77d5</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="uuid">80cb8b15-443c-424b-894c-1ed6674f77d5</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.config"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:2d:0b:78"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <target dev="tap487183e6-b0"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/console.log" append="off"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:00:14 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:00:14 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.630 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Preparing to wait for external event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.630 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.631 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.631 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.632 186792 DEBUG nova.virt.libvirt.vif [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1246188074',display_name='tempest-ServerActionsTestOtherB-server-1246188074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1246188074',id=92,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-bcb74bw3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:10Z,user_data=None,user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=80cb8b15-443c-424b-894c-1ed6674f77d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.632 186792 DEBUG nova.network.os_vif_util [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.632 186792 DEBUG nova.network.os_vif_util [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.633 186792 DEBUG os_vif [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.633 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.634 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.634 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.637 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.637 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap487183e6-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.637 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap487183e6-b0, col_values=(('external_ids', {'iface-id': '487183e6-b09b-4561-97a9-8f8e44492221', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:0b:78', 'vm-uuid': '80cb8b15-443c-424b-894c-1ed6674f77d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.639 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 NetworkManager[55166]: <info>  [1763798414.6404] manager: (tap487183e6-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.641 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.646 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.647 186792 INFO os_vif [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0')#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.690 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.690 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.691 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No VIF found with MAC fa:16:3e:2d:0b:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.691 186792 INFO nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Using config drive#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.772 186792 DEBUG nova.network.neutron [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.888 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.888 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance network_info: |[{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.889 186792 DEBUG oslo_concurrency.lockutils [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.889 186792 DEBUG nova.network.neutron [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.892 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start _get_guest_xml network_info=[{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.897 186792 WARNING nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.903 186792 DEBUG nova.virt.libvirt.host [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.904 186792 DEBUG nova.virt.libvirt.host [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.910 186792 DEBUG nova.virt.libvirt.host [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.910 186792 DEBUG nova.virt.libvirt.host [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.911 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.911 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.912 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.912 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.912 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.912 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.913 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.913 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.913 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.914 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.914 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.914 186792 DEBUG nova.virt.hardware [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.917 186792 DEBUG nova.virt.libvirt.vif [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.918 186792 DEBUG nova.network.os_vif_util [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.918 186792 DEBUG nova.network.os_vif_util [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.919 186792 DEBUG nova.objects.instance [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.936 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <uuid>eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</uuid>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <name>instance-0000005d</name>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestJSON-server-1519356482</nova:name>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:00:14</nova:creationTime>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        <nova:port uuid="a2f45e58-237f-4de0-8339-5f17a4ad3cfe">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="serial">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="uuid">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:df:95:59"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <target dev="tapa2f45e58-23"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log" append="off"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:00:14 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:00:14 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:00:14 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:00:14 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.937 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Preparing to wait for external event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.937 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.938 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.938 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.938 186792 DEBUG nova.virt.libvirt.vif [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.939 186792 DEBUG nova.network.os_vif_util [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.939 186792 DEBUG nova.network.os_vif_util [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.939 186792 DEBUG os_vif [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.940 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.940 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.942 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2f45e58-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.943 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2f45e58-23, col_values=(('external_ids', {'iface-id': 'a2f45e58-237f-4de0-8339-5f17a4ad3cfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:95:59', 'vm-uuid': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.944 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 NetworkManager[55166]: <info>  [1763798414.9451] manager: (tapa2f45e58-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.952 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531888 nova_compute[186788]: 2025-11-22 08:00:14.953 186792 INFO os_vif [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.040 186792 INFO nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Creating config drive at /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.config#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.044 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx4mpckpe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.112 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.113 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.113 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:df:95:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.113 186792 INFO nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Using config drive#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.167 186792 DEBUG oslo_concurrency.processutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx4mpckpe" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:15 np0005531888 kernel: tap487183e6-b0: entered promiscuous mode
Nov 22 03:00:15 np0005531888 NetworkManager[55166]: <info>  [1763798415.2482] manager: (tap487183e6-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Nov 22 03:00:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:15Z|00237|binding|INFO|Claiming lport 487183e6-b09b-4561-97a9-8f8e44492221 for this chassis.
Nov 22 03:00:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:15Z|00238|binding|INFO|487183e6-b09b-4561-97a9-8f8e44492221: Claiming fa:16:3e:2d:0b:78 10.100.0.10
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.252 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:15Z|00239|binding|INFO|Setting lport 487183e6-b09b-4561-97a9-8f8e44492221 ovn-installed in OVS
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.270 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.276 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:0b:78 10.100.0.10'], port_security=['fa:16:3e:2d:0b:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80908aff-0365-41dd-a88b-8ec1981e86fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=487183e6-b09b-4561-97a9-8f8e44492221) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:15Z|00240|binding|INFO|Setting lport 487183e6-b09b-4561-97a9-8f8e44492221 up in Southbound
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.278 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 487183e6-b09b-4561-97a9-8f8e44492221 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 bound to our chassis#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.280 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 03:00:15 np0005531888 systemd-udevd[226579]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.295 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e00e1c92-1f11-4797-b493-9c5be04c395e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:15 np0005531888 systemd-machined[153106]: New machine qemu-40-instance-0000005c.
Nov 22 03:00:15 np0005531888 NetworkManager[55166]: <info>  [1763798415.3069] device (tap487183e6-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:00:15 np0005531888 NetworkManager[55166]: <info>  [1763798415.3076] device (tap487183e6-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:00:15 np0005531888 podman[226540]: 2025-11-22 08:00:15.309991372 +0000 UTC m=+0.072100436 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:00:15 np0005531888 systemd[1]: Started Virtual Machine qemu-40-instance-0000005c.
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.327 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6c95fe-f806-4a90-8417-291ec2a03ac3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.330 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e34bffef-1b11-44c2-a85c-732f4ba63793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:15 np0005531888 podman[226541]: 2025-11-22 08:00:15.33510423 +0000 UTC m=+0.093354268 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.354 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b6159c49-526f-4ca6-9b27-81ebb521553e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.372 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2e59f55a-4dba-42b2-85c0-3d30611ae7a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226604, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.385 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[92b49f99-af03-4a15-b5b2-716ad201cd3b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506813, 'tstamp': 506813}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226606, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506816, 'tstamp': 506816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226606, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.387 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.388 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.394 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.394 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.394 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.395 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:15.395 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.718 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798415.7173388, 80cb8b15-443c-424b-894c-1ed6674f77d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.718 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] VM Started (Lifecycle Event)#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.784 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.790 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798415.7179816, 80cb8b15-443c-424b-894c-1ed6674f77d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.790 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.806 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.811 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:15 np0005531888 nova_compute[186788]: 2025-11-22 08:00:15.832 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.051 186792 INFO nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Creating config drive at /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.057 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3crdajq3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.206 186792 DEBUG oslo_concurrency.processutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3crdajq3" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.279 186792 DEBUG nova.compute.manager [req-e16d6627-d29d-4db3-bd1c-d4daa8eb0aa1 req-b72f368a-1ef8-4184-a30e-7bc1dcc974e8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.279 186792 DEBUG oslo_concurrency.lockutils [req-e16d6627-d29d-4db3-bd1c-d4daa8eb0aa1 req-b72f368a-1ef8-4184-a30e-7bc1dcc974e8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.280 186792 DEBUG oslo_concurrency.lockutils [req-e16d6627-d29d-4db3-bd1c-d4daa8eb0aa1 req-b72f368a-1ef8-4184-a30e-7bc1dcc974e8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.280 186792 DEBUG oslo_concurrency.lockutils [req-e16d6627-d29d-4db3-bd1c-d4daa8eb0aa1 req-b72f368a-1ef8-4184-a30e-7bc1dcc974e8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.280 186792 DEBUG nova.compute.manager [req-e16d6627-d29d-4db3-bd1c-d4daa8eb0aa1 req-b72f368a-1ef8-4184-a30e-7bc1dcc974e8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Processing event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.281 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 03:00:16 np0005531888 kernel: tapa2f45e58-23: entered promiscuous mode
Nov 22 03:00:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:16Z|00241|binding|INFO|Claiming lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe for this chassis.
Nov 22 03:00:16 np0005531888 NetworkManager[55166]: <info>  [1763798416.2835] manager: (tapa2f45e58-23): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Nov 22 03:00:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:16Z|00242|binding|INFO|a2f45e58-237f-4de0-8339-5f17a4ad3cfe: Claiming fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.284 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.289 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798416.2894425, 80cb8b15-443c-424b-894c-1ed6674f77d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.290 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] VM Resumed (Lifecycle Event)
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.292 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 03:00:16 np0005531888 NetworkManager[55166]: <info>  [1763798416.2969] device (tapa2f45e58-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:00:16 np0005531888 NetworkManager[55166]: <info>  [1763798416.2978] device (tapa2f45e58-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.297 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.299 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.299 186792 INFO nova.virt.libvirt.driver [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Instance spawned successfully.
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.299 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 03:00:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:16Z|00243|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe ovn-installed in OVS
Nov 22 03:00:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:16Z|00244|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe up in Southbound
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.303 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.316 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.319 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[49d0d1ff-5801-4352-a268-eb5dccfa6e70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.320 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.322 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.322 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[822828b7-4510-4721-a677-6de8bb75af50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.323 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e7fec9dd-967f-4521-81b2-97ddaefd67f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.328 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.332 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.332 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.333 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:00:16 np0005531888 systemd-machined[153106]: New machine qemu-41-instance-0000005d.
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.333 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.333 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.334 186792 DEBUG nova.virt.libvirt.driver [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.336 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[daba916a-d757-4fde-807c-64d75a33212e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 systemd[1]: Started Virtual Machine qemu-41-instance-0000005d.
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.360 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8a94b0-e282-4628-9712-6b99687f8f30]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.376 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.390 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfea4b3-9905-477e-9f2a-e5070aacaa17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 NetworkManager[55166]: <info>  [1763798416.3995] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/132)
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.398 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b8abb06a-9849-43ac-a8c7-009fb4f08635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.437 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[592a1dea-9900-4341-9a3a-da640766aae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.440 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbddae3-d452-4318-b689-d316a2890636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.447 186792 INFO nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Took 5.94 seconds to spawn the instance on the hypervisor.
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.447 186792 DEBUG nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:00:16 np0005531888 NetworkManager[55166]: <info>  [1763798416.4627] device (tap165f7f23-d0): carrier: link connected
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.469 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b97bbe2b-e565-4c55-8ad5-8b482cf97a96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.489 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab933a5-1b91-4fc0-840c-5963af7a1539]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515710, 'reachable_time': 31353, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226666, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.510 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6adc3619-e332-48f8-b822-10afd5cbb92d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515710, 'tstamp': 515710}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226667, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.521 186792 DEBUG nova.network.neutron [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updated VIF entry in instance network info cache for port 487183e6-b09b-4561-97a9-8f8e44492221. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.521 186792 DEBUG nova.network.neutron [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updating instance_info_cache with network_info: [{"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.524 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5c8d0e-19e5-4d80-9e32-4e605c71e39d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515710, 'reachable_time': 31353, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226668, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.538 186792 DEBUG oslo_concurrency.lockutils [req-30d7585a-c93e-444d-aa30-a6d1f6cbf083 req-acb192d0-fb2d-420e-9ca8-c1bcc72a2927 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.558 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[de9c60ac-9dfe-41a3-9c56-c26cb0420499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.622 186792 INFO nova.compute.manager [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Took 6.57 seconds to build instance.
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.628 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5ba88a-2b99-45f4-8dd0-ffb91ad98b27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.629 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.629 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.630 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.631 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:16 np0005531888 NetworkManager[55166]: <info>  [1763798416.6323] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Nov 22 03:00:16 np0005531888 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.634 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:00:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:16Z|00245|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.639 186792 DEBUG nova.compute.manager [req-5a399889-e4e6-43ef-a1c7-35d038623e39 req-c2be20c4-8b62-421c-add5-13f1314cf2aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.640 186792 DEBUG oslo_concurrency.lockutils [req-5a399889-e4e6-43ef-a1c7-35d038623e39 req-c2be20c4-8b62-421c-add5-13f1314cf2aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.640 186792 DEBUG oslo_concurrency.lockutils [req-5a399889-e4e6-43ef-a1c7-35d038623e39 req-c2be20c4-8b62-421c-add5-13f1314cf2aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.640 186792 DEBUG oslo_concurrency.lockutils [req-5a399889-e4e6-43ef-a1c7-35d038623e39 req-c2be20c4-8b62-421c-add5-13f1314cf2aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.641 186792 DEBUG nova.compute.manager [req-5a399889-e4e6-43ef-a1c7-35d038623e39 req-c2be20c4-8b62-421c-add5-13f1314cf2aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Processing event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.646 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.648 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.649 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e91bf9dc-2d37-40e9-baa7-ecfe69dd5ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.649 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:00:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:16.650 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.661 186792 DEBUG oslo_concurrency.lockutils [None req-ab887a60-5365-4c7c-8a06-38123e99d204 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.802 186792 DEBUG nova.network.neutron [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updated VIF entry in instance network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.803 186792 DEBUG nova.network.neutron [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.825 186792 DEBUG oslo_concurrency.lockutils [req-4bf7bd0c-d333-4d76-9b05-f19a17c000a7 req-10373f6b-fdb2-4c8d-8f12-c1e0422f5dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.847 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.848 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798416.8468127, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.848 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Started (Lifecycle Event)#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.850 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.853 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance spawned successfully.#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.854 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.869 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.873 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.876 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.876 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.877 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.877 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.878 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.878 186792 DEBUG nova.virt.libvirt.driver [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.910 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.911 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798416.8469152, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.911 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.946 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.952 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798416.8498368, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.953 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.968 186792 INFO nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Took 6.18 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.968 186792 DEBUG nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.975 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:16 np0005531888 nova_compute[186788]: 2025-11-22 08:00:16.978 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:17 np0005531888 nova_compute[186788]: 2025-11-22 08:00:17.005 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:00:17 np0005531888 nova_compute[186788]: 2025-11-22 08:00:17.049 186792 INFO nova.compute.manager [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Took 6.96 seconds to build instance.#033[00m
Nov 22 03:00:17 np0005531888 nova_compute[186788]: 2025-11-22 08:00:17.063 186792 DEBUG oslo_concurrency.lockutils [None req-2baf52b3-f1ce-4975-8912-f62faa68f5af b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:17 np0005531888 podman[226705]: 2025-11-22 08:00:17.067287894 +0000 UTC m=+0.077423016 container create 3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:00:17 np0005531888 podman[226705]: 2025-11-22 08:00:17.012746672 +0000 UTC m=+0.022881824 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:00:17 np0005531888 systemd[1]: Started libpod-conmon-3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24.scope.
Nov 22 03:00:17 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:00:17 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08e014cd4a31ece6493233b0839bb36405d991c6c76b7e81f86e79a39e170d1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:00:17 np0005531888 podman[226705]: 2025-11-22 08:00:17.168921014 +0000 UTC m=+0.179056166 container init 3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:00:17 np0005531888 podman[226705]: 2025-11-22 08:00:17.176023869 +0000 UTC m=+0.186158991 container start 3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 03:00:17 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [NOTICE]   (226725) : New worker (226727) forked
Nov 22 03:00:17 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [NOTICE]   (226725) : Loading success.
Nov 22 03:00:17 np0005531888 nova_compute[186788]: 2025-11-22 08:00:17.726 186792 INFO nova.compute.manager [None req-4409e16a-652b-4e60-9a61-41722247d890 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Get console output#033[00m
Nov 22 03:00:17 np0005531888 nova_compute[186788]: 2025-11-22 08:00:17.849 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:00:17 np0005531888 nova_compute[186788]: 2025-11-22 08:00:17.939 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.404 186792 DEBUG nova.compute.manager [req-0d69fc58-cfe1-4ce6-b3f5-d6997a6afb42 req-316c4f20-06c1-4e69-b90a-ae426effbbe2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.404 186792 DEBUG oslo_concurrency.lockutils [req-0d69fc58-cfe1-4ce6-b3f5-d6997a6afb42 req-316c4f20-06c1-4e69-b90a-ae426effbbe2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.405 186792 DEBUG oslo_concurrency.lockutils [req-0d69fc58-cfe1-4ce6-b3f5-d6997a6afb42 req-316c4f20-06c1-4e69-b90a-ae426effbbe2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.405 186792 DEBUG oslo_concurrency.lockutils [req-0d69fc58-cfe1-4ce6-b3f5-d6997a6afb42 req-316c4f20-06c1-4e69-b90a-ae426effbbe2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.405 186792 DEBUG nova.compute.manager [req-0d69fc58-cfe1-4ce6-b3f5-d6997a6afb42 req-316c4f20-06c1-4e69-b90a-ae426effbbe2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] No waiting events found dispatching network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.406 186792 WARNING nova.compute.manager [req-0d69fc58-cfe1-4ce6-b3f5-d6997a6afb42 req-316c4f20-06c1-4e69-b90a-ae426effbbe2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received unexpected event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.687 186792 DEBUG nova.compute.manager [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.688 186792 DEBUG nova.compute.manager [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing instance network info cache due to event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.688 186792 DEBUG oslo_concurrency.lockutils [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.688 186792 DEBUG oslo_concurrency.lockutils [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.689 186792 DEBUG nova.network.neutron [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.744 186792 DEBUG nova.compute.manager [req-1eed1b75-f061-4895-ba20-70068564017f req-d39f24ab-9737-4a28-b927-252a2d043075 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.745 186792 DEBUG oslo_concurrency.lockutils [req-1eed1b75-f061-4895-ba20-70068564017f req-d39f24ab-9737-4a28-b927-252a2d043075 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.745 186792 DEBUG oslo_concurrency.lockutils [req-1eed1b75-f061-4895-ba20-70068564017f req-d39f24ab-9737-4a28-b927-252a2d043075 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.745 186792 DEBUG oslo_concurrency.lockutils [req-1eed1b75-f061-4895-ba20-70068564017f req-d39f24ab-9737-4a28-b927-252a2d043075 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.746 186792 DEBUG nova.compute.manager [req-1eed1b75-f061-4895-ba20-70068564017f req-d39f24ab-9737-4a28-b927-252a2d043075 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:18 np0005531888 nova_compute[186788]: 2025-11-22 08:00:18.746 186792 WARNING nova.compute.manager [req-1eed1b75-f061-4895-ba20-70068564017f req-d39f24ab-9737-4a28-b927-252a2d043075 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:19 np0005531888 nova_compute[186788]: 2025-11-22 08:00:19.003 186792 INFO nova.compute.manager [None req-19ff7c5d-ec9e-488e-a856-aa246d12363f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Get console output#033[00m
Nov 22 03:00:19 np0005531888 nova_compute[186788]: 2025-11-22 08:00:19.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:22 np0005531888 nova_compute[186788]: 2025-11-22 08:00:22.439 186792 DEBUG nova.network.neutron [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updated VIF entry in instance network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:00:22 np0005531888 nova_compute[186788]: 2025-11-22 08:00:22.440 186792 DEBUG nova.network.neutron [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:22 np0005531888 nova_compute[186788]: 2025-11-22 08:00:22.462 186792 DEBUG oslo_concurrency.lockutils [req-59ec2482-8589-4ea6-90c0-4e87f24161a5 req-5a73579a-36b6-43ec-9b10-3e4a410a3872 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:22 np0005531888 nova_compute[186788]: 2025-11-22 08:00:22.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:24 np0005531888 nova_compute[186788]: 2025-11-22 08:00:24.649 186792 DEBUG nova.compute.manager [None req-249d13be-16f4-40ca-909b-cc1aae93953f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Nov 22 03:00:24 np0005531888 podman[226737]: 2025-11-22 08:00:24.703725671 +0000 UTC m=+0.060600612 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:00:24 np0005531888 podman[226736]: 2025-11-22 08:00:24.719099349 +0000 UTC m=+0.077571049 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:00:24 np0005531888 nova_compute[186788]: 2025-11-22 08:00:24.948 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.621 186792 DEBUG oslo_concurrency.lockutils [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.622 186792 DEBUG oslo_concurrency.lockutils [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.622 186792 DEBUG nova.compute.manager [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.627 186792 DEBUG nova.compute.manager [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.628 186792 DEBUG nova.objects.instance [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'flavor' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.649 186792 DEBUG nova.objects.instance [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'info_cache' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.674 186792 DEBUG nova.virt.libvirt.driver [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:00:27 np0005531888 nova_compute[186788]: 2025-11-22 08:00:27.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:29 np0005531888 kernel: tape963f21d-d8 (unregistering): left promiscuous mode
Nov 22 03:00:29 np0005531888 NetworkManager[55166]: <info>  [1763798429.9276] device (tape963f21d-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:00:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:29Z|00246|binding|INFO|Releasing lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 from this chassis (sb_readonly=0)
Nov 22 03:00:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:29Z|00247|binding|INFO|Setting lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 down in Southbound
Nov 22 03:00:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:29Z|00248|binding|INFO|Removing iface tape963f21d-d8 ovn-installed in OVS
Nov 22 03:00:29 np0005531888 nova_compute[186788]: 2025-11-22 08:00:29.949 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:29.954 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:5f:a6 10.100.0.4'], port_security=['fa:16:3e:b9:5f:a6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '92409a46-2dd7-4b20-ac9d-958bbb30993d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e963f21d-d8c0-4f76-b5bc-4a3f577d4055) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:29.956 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 unbound from our chassis#033[00m
Nov 22 03:00:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:29.958 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 03:00:29 np0005531888 nova_compute[186788]: 2025-11-22 08:00:29.963 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:29.979 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[504831f1-e5f9-4198-a620-35b53eeb364e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:29 np0005531888 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000053.scope: Deactivated successfully.
Nov 22 03:00:29 np0005531888 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000053.scope: Consumed 21.521s CPU time.
Nov 22 03:00:29 np0005531888 systemd-machined[153106]: Machine qemu-38-instance-00000053 terminated.
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.006 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4fbe44-e79a-4af3-80ac-29a954d3c387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.009 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dc917822-659a-4408-9606-aad0e66d5a34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.031 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5287a5-d6bb-471d-b408-2524a1528b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.046 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[586c39c5-3e3e-424e-9eba-4d72e7804e2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226802, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.060 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ad76c11c-553e-42cf-bcdb-368449ec14df]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506813, 'tstamp': 506813}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226803, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506816, 'tstamp': 506816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226803, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.061 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.063 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.067 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.067 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.067 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.068 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:30.068 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.177 186792 DEBUG nova.compute.manager [req-76010d1a-6bc6-4f4d-9050-d0673abb575f req-a903f3e3-e6d2-4918-925e-5fcd3b979737 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-unplugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.178 186792 DEBUG oslo_concurrency.lockutils [req-76010d1a-6bc6-4f4d-9050-d0673abb575f req-a903f3e3-e6d2-4918-925e-5fcd3b979737 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.179 186792 DEBUG oslo_concurrency.lockutils [req-76010d1a-6bc6-4f4d-9050-d0673abb575f req-a903f3e3-e6d2-4918-925e-5fcd3b979737 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.179 186792 DEBUG oslo_concurrency.lockutils [req-76010d1a-6bc6-4f4d-9050-d0673abb575f req-a903f3e3-e6d2-4918-925e-5fcd3b979737 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.179 186792 DEBUG nova.compute.manager [req-76010d1a-6bc6-4f4d-9050-d0673abb575f req-a903f3e3-e6d2-4918-925e-5fcd3b979737 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-unplugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.179 186792 WARNING nova.compute.manager [req-76010d1a-6bc6-4f4d-9050-d0673abb575f req-a903f3e3-e6d2-4918-925e-5fcd3b979737 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received unexpected event network-vif-unplugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with vm_state active and task_state powering-off.#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.691 186792 INFO nova.virt.libvirt.driver [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.697 186792 INFO nova.virt.libvirt.driver [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance destroyed successfully.#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.697 186792 DEBUG nova.objects.instance [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'numa_topology' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.716 186792 DEBUG nova.compute.manager [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:30 np0005531888 nova_compute[186788]: 2025-11-22 08:00:30.817 186792 DEBUG oslo_concurrency.lockutils [None req-217e5ac4-3341-4608-8a70-fd2c3c61dbe9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.323 186792 DEBUG nova.compute.manager [req-464c17fb-f88b-466e-ab3c-6cd8c95dc898 req-1270edfa-8c8a-4af4-b29b-13040698e56f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.323 186792 DEBUG oslo_concurrency.lockutils [req-464c17fb-f88b-466e-ab3c-6cd8c95dc898 req-1270edfa-8c8a-4af4-b29b-13040698e56f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.324 186792 DEBUG oslo_concurrency.lockutils [req-464c17fb-f88b-466e-ab3c-6cd8c95dc898 req-1270edfa-8c8a-4af4-b29b-13040698e56f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.324 186792 DEBUG oslo_concurrency.lockutils [req-464c17fb-f88b-466e-ab3c-6cd8c95dc898 req-1270edfa-8c8a-4af4-b29b-13040698e56f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.324 186792 DEBUG nova.compute.manager [req-464c17fb-f88b-466e-ab3c-6cd8c95dc898 req-1270edfa-8c8a-4af4-b29b-13040698e56f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.324 186792 WARNING nova.compute.manager [req-464c17fb-f88b-466e-ab3c-6cd8c95dc898 req-1270edfa-8c8a-4af4-b29b-13040698e56f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received unexpected event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with vm_state stopped and task_state resize_prep.#033[00m
Nov 22 03:00:32 np0005531888 podman[226849]: 2025-11-22 08:00:32.696405572 +0000 UTC m=+0.065335269 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:00:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:32Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:0b:78 10.100.0.10
Nov 22 03:00:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:32Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:0b:78 10.100.0.10
Nov 22 03:00:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:32Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:00:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:32Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:00:32 np0005531888 nova_compute[186788]: 2025-11-22 08:00:32.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:34.066 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:00:34 np0005531888 nova_compute[186788]: 2025-11-22 08:00:34.067 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:34.068 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:00:34 np0005531888 nova_compute[186788]: 2025-11-22 08:00:34.164 186792 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:00:34 np0005531888 nova_compute[186788]: 2025-11-22 08:00:34.165 186792 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:00:34 np0005531888 nova_compute[186788]: 2025-11-22 08:00:34.165 186792 DEBUG nova.network.neutron [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 03:00:34 np0005531888 nova_compute[186788]: 2025-11-22 08:00:34.954 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:35 np0005531888 podman[226870]: 2025-11-22 08:00:35.683373875 +0000 UTC m=+0.052733558 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:00:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:36.813 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:00:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:36.814 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:00:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:36.814 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.842 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'name': 'tempest-ServerActionsTestJSON-server-1519356482', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'hostId': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.844 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '92409a46-2dd7-4b20-ac9d-958bbb30993d', 'name': 'tempest-ServerActionsTestOtherB-server-342710330', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000053', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '62d9a4a13f5d41529bc273c278fae96b', 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'hostId': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.847 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '62d9a4a13f5d41529bc273c278fae96b', 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'hostId': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.847 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.863 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/memory.usage volume: 40.4375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.864 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.879 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ee49c28-18d6-4018-bd96-ee2e9fd7a9a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4375, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'timestamp': '2025-11-22T08:00:36.847672', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5459faba-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.56254173, 'message_signature': '9f3d0ac350bdb8085ce02cc9097985c0c3cbd9709a9a82594ae99c1a49aecc1d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'timestamp': '2025-11-22T08:00:36.847672', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '545c5e9a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.578560184, 'message_signature': 'd12e6cecd13953074dde6b707f41453ff1ab8a467445143683451526ebed7a75'}]}, 'timestamp': '2025-11-22 08:00:36.879760', '_unique_id': '004dea7829ee433abf86369d7e46ef54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:00:36 np0005531888 nova_compute[186788]: 2025-11-22 08:00:36.894 186792 DEBUG nova.network.neutron [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.905 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.requests volume: 1127 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.906 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.906 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 nova_compute[186788]: 2025-11-22 08:00:36.924 186792 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.931 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.931 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9946dac7-468b-4edd-8da4-c48e6316c468', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1127, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:36.882454', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5460633c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': 'd4f5ce424987b49527794c00f05f74763ecd4361016a070e1f88308614c09aea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:36.882454', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '54607034-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '855ad32391970c7fd872f7e7fb8b632704f00f36fc851a56277b792e902ec213'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:36.882454', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5464501e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '834225dfdd380c8f5006e7502f200e3fbf3b8e7fd81877d7f2479e9d84b02c22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:36.882454', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '54645b9a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '4d635dd1abfa8277ac0b16f81f537c36547617368986b2b0bdd31a610449c466'}]}, 'timestamp': '2025-11-22 08:00:36.931994', '_unique_id': '8d3112e61ebf486ebb6cac03ca82e7dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.943 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.944 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.945 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.954 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.954 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdcf1096-f2ed-40d7-9f79-d1cb75a39fc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:36.934103', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '54663c80-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.633613268, 'message_signature': '1e5371b399707530fda9b1e2434077421a97197c77ff3bcfebded546deb0d612'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:36.934103', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546646da-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.633613268, 'message_signature': '9cfd44d6cc43caae1a3808672e44de00ec4b9a16919d1f9d181e8f2c4b6310d6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:36.934103', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5467cc1c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.644700321, 'message_signature': '4042243494336ced6c492e833102576ea595e2c719658cd9bb152e537863f3c5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:36.934103', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5467e224-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.644700321, 'message_signature': '439cf47f60e56c49b8f4447f1d097e96b08ea281cc9acb3cc66061a8d6517796'}]}, 'timestamp': '2025-11-22 08:00:36.955210', '_unique_id': 'da4cd416cbb74a6e87cb4f97747717a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.958 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.958 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>]
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.959 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.latency volume: 1922593702 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.959 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.latency volume: 240453791 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.961 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.961 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.latency volume: 1310824649 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.961 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.latency volume: 113214800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cf1639b-0c0a-412d-b75e-8f936b27460d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1922593702, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:36.959113', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '54688d3c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': 'db90d4885d4648d74a1cbfa7bd9d35e9a73a1258717934d045a540eb99ebdc40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 240453791, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 
'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:36.959113', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5468a178-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': 'a55c5a23ee75670b41a170fe1cb5f470c550af267106ed2eeac77b51d124b5a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1310824649, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:36.959113', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5468dee0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': 'ec55f504ecd1aef8bae9dae81da0b879e54bb5cad2f7cdf0854c2d7ba0c91dc4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 113214800, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:36.959113', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5468f34e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': 'bc722de901bf2214bf7ee89c8acb277c58ce82c6c72c6213680593fd63043e1e'}]}, 'timestamp': '2025-11-22 08:00:36.962190', '_unique_id': '0c10656df9c84f03959d3c1ccee802e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.964 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.968 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 / tapa2f45e58-23 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.968 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.969 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.972 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 80cb8b15-443c-424b-894c-1ed6674f77d5 / tap487183e6-b0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.973 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9739788-e3d7-4801-bb12-47524c309d29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:36.964905', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '546a0054-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': 'ebae9c77b693c70d730cf69c8c2f4a64739759e7722c2bd7beea37b2f4f221cf'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:36.964905', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '546ab3be-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': 'a9620df4db2403758040a54c464a17a306ff298024d127a2aab5e1c3b5d703b7'}]}, 'timestamp': '2025-11-22 08:00:36.973799', '_unique_id': '8445f9535e404e9a81e42db155d3c076'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.976 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.976 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.977 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.977 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.requests volume: 268 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.977 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e65a9fae-44ac-4a8c-9073-ba426fab1434', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:36.976021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546b1ec6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '8cbc707c6b850314ec6caf8f861973e5f67e91734a37895cb6b9db564ac63bd9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': 
None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:36.976021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546b2970-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '8cd1e11f0a572795655bfcf5839b1af455f3b45f7593dfad81a1dce6d08ba16b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 268, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:36.976021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546b5288-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '1d457aef4bca3e50ce99ad78a8fb7f9d260b45e668053877546294bbfd6649d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:36.976021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546b5da0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '4cf8ab8aa4d2504df8fdfb60644aa2ed11f242b62e9d6f6a80d85c9f70828a7c'}]}, 'timestamp': '2025-11-22 08:00:36.977906', '_unique_id': 'ead8e2e52f1649c7b9c336fc2fb3b979'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.979 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.979 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.980 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.980 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.981 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.981 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb002145-28d2-4952-b782-b91d299db62f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:36.979784', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546bb1d8-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.633613268, 'message_signature': '5b5a9a98d43e948213a7728b13523ffe76e934c3b63b7835dbdf3a63db22ce60'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:36.979784', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546bbce6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.633613268, 'message_signature': '1ae3ac721238eaea0755f51b966d788e11f6d0c45ff7372fd4e9ae57f23ed77e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:36.979784', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546be338-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.644700321, 'message_signature': '2b992f81ca8e514b4a9a801788e591fc963229eadd51623921be217dad00a96e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:36.979784', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546bedec-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.644700321, 'message_signature': '81c082c685a271484701460d4de32a7038a6472bc8d9e8c27c9a5b987b3a5084'}]}, 'timestamp': '2025-11-22 08:00:36.981627', '_unique_id': '3a3eca7745f1403494f8813d83dc8b26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.983 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.984 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.984 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1f5e1b2-982f-4236-a6f4-f36c2b63e62d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:36.983365', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '546c3d56-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': '1c5af939e44fbf077f9deb4fb650a3d55dd820c163fd84f77498ca392a676d7c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:36.983365', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '546c5dfe-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': '9e171d16caba468d53b87df26441b0ade8195297721c48baae529a4b98634daf'}]}, 'timestamp': '2025-11-22 08:00:36.984480', '_unique_id': 'c92213da35a446ca8e83d22e53bdb849'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.986 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.bytes volume: 1722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.986 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '969d682d-9d2c-4d19-a2ae-715142abac63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1722, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:36.986084', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '546ca728-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': 'fa507129ceee49a27ad787e0c1ccee276fd2ae7e196b12ab54a9227c150f2cb2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:36.986084', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '546cccbc-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': 'd7de835ea80c848e0dd776314a9bb11fe3862f59549b4febaccea1a20f761bfd'}]}, 'timestamp': '2025-11-22 08:00:36.987313', '_unique_id': 'b191c77caeaa4cf7a97caa8ba7340d84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.987 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.988 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.988 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/cpu volume: 13810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.989 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.989 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/cpu volume: 13220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01953e4e-197b-4f8e-a969-6b4dbb78ff0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13810000000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'timestamp': '2025-11-22T08:00:36.988809', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '546d1398-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.56254173, 'message_signature': '921a0154998fcc6437e0e6581b5ea5fd622925a6e676dc8efef3356213487f2d'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13220000000, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 
'80cb8b15-443c-424b-894c-1ed6674f77d5', 'timestamp': '2025-11-22T08:00:36.988809', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '546d3620-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.578560184, 'message_signature': '5d6732b28bc57924604dea1c19c0a2dc30988174a825c15ebe8485e8548868d7'}]}, 'timestamp': '2025-11-22 08:00:36.990006', '_unique_id': '8db413d4f51341778b1b296c6e39833a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.990 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.992 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.992 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>]
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.993 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.994 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.994 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4189aa6c-03a1-4292-9d2f-3031c4f29e7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:36.993286', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '546dc11c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': '4fffe9d71740096604b2944a6267cef95b2c52bb75d01782bd62e4ea7f0bb839'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:36.993286', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '546df47a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': 'b4aedb7d9eb52168d177cafe6600d2e44e52379f469422250aafa27c68d951a0'}]}, 'timestamp': '2025-11-22 08:00:36.994940', '_unique_id': 'a27284b7d217476ead2fd8a4cb527c1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.997 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.998 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.998 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba8574b0-2112-4ca9-8cc6-206087f6bc55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:36.997066', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '546e546a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': 'd370518b731e1c52d1751e3d2d5ca70d908725295fc4fbd833662760d93ee3d7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:36.997066', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '546e835e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': 'f40bae2b94eed809ce4859ec7bc42cd8433e8512a8b47921920134d7d186003e'}]}, 'timestamp': '2025-11-22 08:00:36.998621', '_unique_id': '5bc6ed6ce9ae49b1a66091fd2631daaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.000 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.bytes volume: 30935552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.000 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.001 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.001 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.bytes volume: 30747136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.001 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92b34f5f-5700-4261-acd8-ca1ce8e4d045', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30935552, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:37.000323', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546ed30e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': 'a001bab2a57b28ad0d86f4a04716c1eb181830a272cc02dbc5212db999b1d3c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:37.000323', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546edee4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '0d16e5eba914c5e5cf6fb7d16e574904fd8e16e359d1830e7bf0388f739177a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30747136, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:37.000323', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546f05f4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '78e5d9a46c903623a07bb6db4c129ea195227a360ab9717a026bd83e2c305bde'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:37.000323', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546f1076-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '83bb91f6a05eebe684153f91ba9475f613bbd3119357178acac9fda22a25b81a'}]}, 'timestamp': '2025-11-22 08:00:37.002180', '_unique_id': '9f489bfbd42f4b4dbba1e7363a54e758'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.002 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.003 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.004 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.004 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56ad6e73-0320-4ef8-b903-17e3790f6c9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:37.003794', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '546f5aa4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': '25a9b4bcf52219f5ba2c08647b10b88bf10318ed1bb94224e6965f37338bfcc2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:37.003794', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '546f7f84-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': '873ad8c97b6642beeefbef071c41264343eb50613a3244e6d828e8b42206df71'}]}, 'timestamp': '2025-11-22 08:00:37.005076', '_unique_id': 'e207a1ffd6384046b1003c1c7320cf02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.005 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.006 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.latency volume: 36130857862 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.006 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.007 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.007 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.latency volume: 72565896023 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.008 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07012944-825d-4d15-9d22-553cf64571e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 36130857862, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:37.006606', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546fc872-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '414e3ed4636d2d477f8008aa27b8a0293ce8c75caa573a34669e608691710126'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': 
None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:37.006606', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '546fd33a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': 'ceb52acb94ef9e63f220ecac85b638da814d031d5a2a6b1a688ed125ff41dab3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 72565896023, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:37.006606', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '546ff658-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': 'e4d74c733dbf4e01d7b6f0ad1718d8a0e5b6e7879192aea3011292b18443c22f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:37.006606', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '547000ee-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': '5b7b7a21553a14c37e3ecfe50e920e3ad076d2fc651d8a7266a87029911a325e'}]}, 'timestamp': '2025-11-22 08:00:37.008339', '_unique_id': '6ef2ed54b8f3405bbe7537959c1d4b7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.009 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.010 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.010 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>]
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.010 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.011 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.011 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71171104-daad-437a-b674-2b7c0befdb22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:37.010491', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '5470617e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': '564e4d5cca3f3ea5b9b417c095638ae89ffbe9e4308335a6f32e683c92dcf073'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 
'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:37.010491', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '54708a28-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': '9c144d897644f9346a2e245ec15e9ba53ac19c5256d5da2d0f0643faf059a653'}]}, 'timestamp': '2025-11-22 08:00:37.011864', '_unique_id': 'c05171c00c37467189d264502ab8be7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.012 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.013 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.014 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.014 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aae78a3d-972d-44bd-8f2c-b556cbe21d48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:37.013369', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '5470d0e6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': 'cb9fc4502df5021a20b85758b08e74e39630955df0890e4b0ac256ba88c42774'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:37.013369', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '5470f45e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': '9aad3038cac52f1dd218c0a0a03af4790627db36d77c941ac4b6811fee796e5d'}]}, 'timestamp': '2025-11-22 08:00:37.014610', '_unique_id': '8739935893ba4411b94008261aadc188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.016 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.bytes volume: 72761344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.016 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.017 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.017 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.bytes volume: 72708096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.017 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '356ae885-ee66-45b9-bcba-661f7ebc593d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72761344, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:37.016076', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '54713a68-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '700994d6475c15ce6e1a2cb46f85f0ba10164fb81e5a37f5af0dbefd3571076b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:37.016076', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5471451c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.581954817, 'message_signature': '713dcec19925f0021fc775ec3cd420a0cb24114788827c3ab2aa8cc3e3a5d4d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72708096, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:37.016076', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '547169ca-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': 'efe21f9b5da88093bd180e4aa3bad8070a8106383e8ee7822601790b1a623dee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:37.016076', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5471756e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.606553743, 'message_signature': 'cd3afb541b02f638cb047a9f2c7096e58912ad16e5ed56091b46ce68b00d458c'}]}, 'timestamp': '2025-11-22 08:00:37.017877', '_unique_id': '095bf37cc6e2493fac33465220cab072'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.018 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.019 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.019 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.020 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.020 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0da24e0-1152-4032-b6b0-7eeb5d676d58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:00:37.019484', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5471c08c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.633613268, 'message_signature': '9683e21fca6e792f3686f4520ed17b8a45e6a96845779d38f7f428b6d5ea7ea3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:00:37.019484', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5471cb2c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.633613268, 'message_signature': '9f89fa6bfcfc049a676b7165bd2dc6b9bd58de0ac47931e34b3e5d458ed6914f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:00:37.019484', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5471eeea-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.644700321, 'message_signature': '571e3c5a85fe56e53eba31f2866cbf57001df141fbafb2e7ac4f00e656e47b3e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:00:37.019484', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5471faf2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.644700321, 'message_signature': '65a3731b820dd8efd903db26c4ccad2addb38011b145f513db939238eafba3a4'}]}, 'timestamp': '2025-11-22 08:00:37.021289', '_unique_id': '7fcec1f782e14c55b26cb8677e879251'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.021 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.023 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.023 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.024 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2810dc4f-ced4-4296-a4cf-4c034dc261d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:37.023084', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '54724c6e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': '0d2f3540c0a9748390d00528c57b03b291a6898f48741d76174b6ede02ee658c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 
'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:37.023084', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '547271ee-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': '79ba6deb830263787f15818b7f0dbfd63a61f9f6383c1fc1a0d81db423ca4ecd'}]}, 'timestamp': '2025-11-22 08:00:37.024354', '_unique_id': '95757b0a5ab540c4a77f438358b8390f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.026 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.026 12 DEBUG ceilometer.compute.pollsters [-] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000053, id=92409a46-2dd7-4b20-ac9d-958bbb30993d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.026 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a826c8b-9072-4d97-a0bf-fe8260336e77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:00:37.025984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '5472bd48-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.664449857, 'message_signature': 'cda0fc4e0d3f9ffc52c1fc133f4ab20e4f8730d102cbf8a53fa6d3a2b7e9ba21'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:00:37.025984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '5472e318-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5177.669571113, 'message_signature': 'a554a658cefa12b7fe0b83f651ee6232ad10cfb8f5f96b7d89636dc66e7828a1'}]}, 'timestamp': '2025-11-22 08:00:37.027248', '_unique_id': '50e8eab379384e21bbd2d53ff23cf1ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.027 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.028 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:00:37.028 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1519356482>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-342710330>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1246188074>]
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.043 186792 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.043 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Creating file /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/908493debe8e480cbaa76648342d43e6.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.044 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/908493debe8e480cbaa76648342d43e6.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:37 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.494 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/908493debe8e480cbaa76648342d43e6.tmp" returned: 1 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.496 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/908493debe8e480cbaa76648342d43e6.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.496 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Creating directory /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.497 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.726 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.731 186792 INFO nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance already shutdown.#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.739 186792 INFO nova.virt.libvirt.driver [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance destroyed successfully.#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.740 186792 DEBUG nova.virt.libvirt.vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.741 186792 DEBUG nova.network.os_vif_util [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.743 186792 DEBUG nova.network.os_vif_util [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.743 186792 DEBUG os_vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.745 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.746 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape963f21d-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.754 186792 INFO os_vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.760 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.823 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.824 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.888 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.890 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Copying file /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk to 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.891 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:37 np0005531888 nova_compute[186788]: 2025-11-22 08:00:37.948 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:38 np0005531888 nova_compute[186788]: 2025-11-22 08:00:38.562 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "scp -r /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk" returned: 0 in 0.672s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:38 np0005531888 nova_compute[186788]: 2025-11-22 08:00:38.564 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Copying file /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:00:38 np0005531888 nova_compute[186788]: 2025-11-22 08:00:38.564 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk.config 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:38 np0005531888 nova_compute[186788]: 2025-11-22 08:00:38.790 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "scp -C -r /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk.config 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:38 np0005531888 nova_compute[186788]: 2025-11-22 08:00:38.791 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Copying file /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:00:38 np0005531888 nova_compute[186788]: 2025-11-22 08:00:38.791 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk.info 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:39 np0005531888 nova_compute[186788]: 2025-11-22 08:00:39.021 186792 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "scp -C -r /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_resize/disk.info 192.168.122.100:/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" returned: 0 in 0.230s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:39 np0005531888 podman[226909]: 2025-11-22 08:00:39.691128361 +0000 UTC m=+0.063788799 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:00:40 np0005531888 nova_compute[186788]: 2025-11-22 08:00:40.427 186792 DEBUG neutronclient.v2_0.client [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:00:40 np0005531888 nova_compute[186788]: 2025-11-22 08:00:40.550 186792 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:40 np0005531888 nova_compute[186788]: 2025-11-22 08:00:40.551 186792 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:40 np0005531888 nova_compute[186788]: 2025-11-22 08:00:40.551 186792 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:42.070 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:42 np0005531888 nova_compute[186788]: 2025-11-22 08:00:42.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:42 np0005531888 nova_compute[186788]: 2025-11-22 08:00:42.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:43 np0005531888 nova_compute[186788]: 2025-11-22 08:00:43.837 186792 DEBUG nova.compute.manager [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:43 np0005531888 nova_compute[186788]: 2025-11-22 08:00:43.838 186792 DEBUG nova.compute.manager [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing instance network info cache due to event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:43 np0005531888 nova_compute[186788]: 2025-11-22 08:00:43.838 186792 DEBUG oslo_concurrency.lockutils [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:43 np0005531888 nova_compute[186788]: 2025-11-22 08:00:43.838 186792 DEBUG oslo_concurrency.lockutils [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:43 np0005531888 nova_compute[186788]: 2025-11-22 08:00:43.838 186792 DEBUG nova.network.neutron [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:45 np0005531888 nova_compute[186788]: 2025-11-22 08:00:45.190 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798430.189431, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:45 np0005531888 nova_compute[186788]: 2025-11-22 08:00:45.190 186792 INFO nova.compute.manager [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:00:45 np0005531888 nova_compute[186788]: 2025-11-22 08:00:45.214 186792 DEBUG nova.compute.manager [None req-ee6656fd-5269-4a59-acbe-70f3545691e3 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:45 np0005531888 nova_compute[186788]: 2025-11-22 08:00:45.217 186792 DEBUG nova.compute.manager [None req-ee6656fd-5269-4a59-acbe-70f3545691e3 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: resize_migrated, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:45 np0005531888 nova_compute[186788]: 2025-11-22 08:00:45.303 186792 INFO nova.compute.manager [None req-ee6656fd-5269-4a59-acbe-70f3545691e3 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 22 03:00:45 np0005531888 podman[226932]: 2025-11-22 08:00:45.687886657 +0000 UTC m=+0.059154825 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:00:45 np0005531888 podman[226933]: 2025-11-22 08:00:45.716899081 +0000 UTC m=+0.083396452 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:00:46 np0005531888 nova_compute[186788]: 2025-11-22 08:00:46.485 186792 DEBUG nova.network.neutron [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updated VIF entry in instance network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:00:46 np0005531888 nova_compute[186788]: 2025-11-22 08:00:46.486 186792 DEBUG nova.network.neutron [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:46 np0005531888 nova_compute[186788]: 2025-11-22 08:00:46.499 186792 DEBUG oslo_concurrency.lockutils [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:47 np0005531888 nova_compute[186788]: 2025-11-22 08:00:47.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:47 np0005531888 nova_compute[186788]: 2025-11-22 08:00:47.952 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:49 np0005531888 nova_compute[186788]: 2025-11-22 08:00:49.786 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:49 np0005531888 nova_compute[186788]: 2025-11-22 08:00:49.787 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:49 np0005531888 nova_compute[186788]: 2025-11-22 08:00:49.787 186792 DEBUG nova.compute.manager [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Going to confirm migration 14 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 03:00:49 np0005531888 nova_compute[186788]: 2025-11-22 08:00:49.814 186792 DEBUG nova.objects.instance [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'info_cache' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:49 np0005531888 nova_compute[186788]: 2025-11-22 08:00:49.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:49 np0005531888 nova_compute[186788]: 2025-11-22 08:00:49.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:50 np0005531888 nova_compute[186788]: 2025-11-22 08:00:50.694 186792 DEBUG neutronclient.v2_0.client [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:00:50 np0005531888 nova_compute[186788]: 2025-11-22 08:00:50.694 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:50 np0005531888 nova_compute[186788]: 2025-11-22 08:00:50.695 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:50 np0005531888 nova_compute[186788]: 2025-11-22 08:00:50.695 186792 DEBUG nova.network.neutron [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:50 np0005531888 nova_compute[186788]: 2025-11-22 08:00:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:51 np0005531888 nova_compute[186788]: 2025-11-22 08:00:51.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:51 np0005531888 nova_compute[186788]: 2025-11-22 08:00:51.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:00:51 np0005531888 nova_compute[186788]: 2025-11-22 08:00:51.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:00:52 np0005531888 nova_compute[186788]: 2025-11-22 08:00:52.479 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:52 np0005531888 nova_compute[186788]: 2025-11-22 08:00:52.480 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:52 np0005531888 nova_compute[186788]: 2025-11-22 08:00:52.480 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:00:52 np0005531888 nova_compute[186788]: 2025-11-22 08:00:52.480 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 80cb8b15-443c-424b-894c-1ed6674f77d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:52 np0005531888 nova_compute[186788]: 2025-11-22 08:00:52.756 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:52 np0005531888 nova_compute[186788]: 2025-11-22 08:00:52.954 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.330 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.331 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.331 186792 INFO nova.compute.manager [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Rebooting instance#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.345 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.345 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.346 186792 DEBUG nova.network.neutron [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.747 186792 DEBUG nova.network.neutron [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.766 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.766 186792 DEBUG nova.objects.instance [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.800 186792 DEBUG nova.virt.libvirt.vif [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.801 186792 DEBUG nova.network.os_vif_util [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.801 186792 DEBUG nova.network.os_vif_util [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.802 186792 DEBUG os_vif [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.806 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape963f21d-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.806 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.808 186792 INFO os_vif [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')#033[00m
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.809 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.809 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.931 186792 DEBUG nova.compute.provider_tree [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:00:53 np0005531888 nova_compute[186788]: 2025-11-22 08:00:53.954 186792 DEBUG nova.scheduler.client.report [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.180 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.181 186792 DEBUG nova.compute.manager [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4805
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.362 186792 INFO nova.scheduler.client.report [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Deleted allocation for migration 8a0018e5-ed9f-45b7-a5a4-16ecd560c356
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.450 186792 DEBUG oslo_concurrency.lockutils [None req-e73b89f8-3ead-429c-857b-a28ac04fc540 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.508 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updating instance_info_cache with network_info: [{"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.546 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.547 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.547 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:00:54 np0005531888 nova_compute[186788]: 2025-11-22 08:00:54.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.136 186792 DEBUG nova.network.neutron [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.150 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.158 186792 DEBUG nova.compute.manager [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:00:55 np0005531888 kernel: tapa2f45e58-23 (unregistering): left promiscuous mode
Nov 22 03:00:55 np0005531888 NetworkManager[55166]: <info>  [1763798455.5615] device (tapa2f45e58-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:00:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:55Z|00249|binding|INFO|Releasing lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe from this chassis (sb_readonly=0)
Nov 22 03:00:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:55Z|00250|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe down in Southbound
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.571 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:55Z|00251|binding|INFO|Removing iface tapa2f45e58-23 ovn-installed in OVS
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:55.586 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:00:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:55.587 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.589 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:55.590 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 03:00:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:55.591 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[efde52a0-156f-4328-b061-849af4e01c36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:00:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:55.592 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 03:00:55 np0005531888 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Nov 22 03:00:55 np0005531888 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000005d.scope: Consumed 16.175s CPU time.
Nov 22 03:00:55 np0005531888 systemd-machined[153106]: Machine qemu-41-instance-0000005d terminated.
Nov 22 03:00:55 np0005531888 podman[226978]: 2025-11-22 08:00:55.661857421 +0000 UTC m=+0.070611918 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:00:55 np0005531888 podman[226979]: 2025-11-22 08:00:55.661073871 +0000 UTC m=+0.067370417 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 03:00:55 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [NOTICE]   (226725) : haproxy version is 2.8.14-c23fe91
Nov 22 03:00:55 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [NOTICE]   (226725) : path to executable is /usr/sbin/haproxy
Nov 22 03:00:55 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [WARNING]  (226725) : Exiting Master process...
Nov 22 03:00:55 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [WARNING]  (226725) : Exiting Master process...
Nov 22 03:00:55 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [ALERT]    (226725) : Current worker (226727) exited with code 143 (Terminated)
Nov 22 03:00:55 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[226721]: [WARNING]  (226725) : All workers exited. Exiting... (0)
Nov 22 03:00:55 np0005531888 systemd[1]: libpod-3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24.scope: Deactivated successfully.
Nov 22 03:00:55 np0005531888 podman[227035]: 2025-11-22 08:00:55.766400171 +0000 UTC m=+0.073242892 container died 3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.795 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance destroyed successfully.
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.797 186792 DEBUG nova.objects.instance [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.808 186792 DEBUG nova.virt.libvirt.vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.809 186792 DEBUG nova.network.os_vif_util [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.810 186792 DEBUG nova.network.os_vif_util [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.811 186792 DEBUG os_vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.812 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.813 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2f45e58-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.817 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.819 186792 INFO os_vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.825 186792 DEBUG nova.virt.libvirt.driver [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start _get_guest_xml network_info=[{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.830 186792 WARNING nova.virt.libvirt.driver [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.837 186792 DEBUG nova.virt.libvirt.host [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.838 186792 DEBUG nova.virt.libvirt.host [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.842 186792 DEBUG nova.virt.libvirt.host [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.844 186792 DEBUG nova.virt.libvirt.host [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.845 186792 DEBUG nova.virt.libvirt.driver [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 03:00:55 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24-userdata-shm.mount: Deactivated successfully.
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.845 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.846 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.846 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.846 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.846 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.847 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.847 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.847 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.847 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.847 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:00:55 np0005531888 systemd[1]: var-lib-containers-storage-overlay-08e014cd4a31ece6493233b0839bb36405d991c6c76b7e81f86e79a39e170d1f-merged.mount: Deactivated successfully.
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.848 186792 DEBUG nova.virt.hardware [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.848 186792 DEBUG nova.objects.instance [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.862 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.932 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.934 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.934 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.935 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.936 186792 DEBUG nova.virt.libvirt.vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.937 186792 DEBUG nova.network.os_vif_util [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.938 186792 DEBUG nova.network.os_vif_util [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.939 186792 DEBUG nova.objects.instance [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.960 186792 DEBUG nova.virt.libvirt.driver [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <uuid>eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</uuid>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <name>instance-0000005d</name>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestJSON-server-1519356482</nova:name>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:00:55</nova:creationTime>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        <nova:port uuid="a2f45e58-237f-4de0-8339-5f17a4ad3cfe">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <entry name="serial">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <entry name="uuid">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:df:95:59"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <target dev="tapa2f45e58-23"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log" append="off"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:00:55 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:00:55 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:00:55 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:00:55 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:00:55 np0005531888 nova_compute[186788]: 2025-11-22 08:00:55.961 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.023 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.025 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.088 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.090 186792 DEBUG nova.objects.instance [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.113 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.174 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.175 186792 DEBUG nova.virt.disk.api [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.176 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:56 np0005531888 podman[227035]: 2025-11-22 08:00:56.236669804 +0000 UTC m=+0.543512355 container cleanup 3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.236 186792 DEBUG oslo_concurrency.processutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.237 186792 DEBUG nova.virt.disk.api [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.238 186792 DEBUG nova.objects.instance [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:56 np0005531888 systemd[1]: libpod-conmon-3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24.scope: Deactivated successfully.
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.255 186792 DEBUG nova.virt.libvirt.vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.256 186792 DEBUG nova.network.os_vif_util [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.257 186792 DEBUG nova.network.os_vif_util [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.257 186792 DEBUG os_vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.259 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.259 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.262 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.262 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2f45e58-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.263 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2f45e58-23, col_values=(('external_ids', {'iface-id': 'a2f45e58-237f-4de0-8339-5f17a4ad3cfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:95:59', 'vm-uuid': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.265 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.2658] manager: (tapa2f45e58-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.267 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.270 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.271 186792 INFO os_vif [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:00:56 np0005531888 podman[227096]: 2025-11-22 08:00:56.33815951 +0000 UTC m=+0.075657311 container remove 3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.345 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[177546f0-1c39-4208-9a11-4586c3dd87ed]: (4, ('Sat Nov 22 08:00:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24)\n3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24\nSat Nov 22 08:00:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24)\n3c104fb31cceb8f2398c05f47f6312c2ae6aeb48eff3128adf5021068088ec24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.348 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6697d45-07af-45bd-9491-6adf37cbfbbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.350 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.352 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.3577] manager: (tapa2f45e58-23): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Nov 22 03:00:56 np0005531888 kernel: tapa2f45e58-23: entered promiscuous mode
Nov 22 03:00:56 np0005531888 systemd-udevd[227001]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:00:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:56Z|00252|binding|INFO|Claiming lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe for this chassis.
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.371 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:56Z|00253|binding|INFO|a2f45e58-237f-4de0-8339-5f17a4ad3cfe: Claiming fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.375 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[adb584ea-6237-4e74-bf8a-4d6033d3e4c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.3772] device (tapa2f45e58-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.3786] device (tapa2f45e58-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:00:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:56Z|00254|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe ovn-installed in OVS
Nov 22 03:00:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:56Z|00255|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe up in Southbound
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.387 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.389 186792 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.389 186792 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.390 186792 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.390 186792 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.390 186792 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.390 186792 WARNING nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state reboot_started_hard.#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.391 186792 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.391 186792 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.391 186792 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.391 186792 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.391 186792 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.392 186792 WARNING nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state reboot_started_hard.#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.392 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[07b544e3-c70f-4cff-82a9-854290356e26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.392 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.394 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5fdd14f3-2ac8-4346-9a2f-a3dd5c72b580]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 systemd-machined[153106]: New machine qemu-42-instance-0000005d.
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.412 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[88c3ea87-a29b-4599-a890-24d5162da785]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515702, 'reachable_time': 25186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227129, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 systemd[1]: Started Virtual Machine qemu-42-instance-0000005d.
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.415 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.415 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[cfafd652-3b14-4d12-831e-ece4fd84f96a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.417 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:00:56 np0005531888 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.418 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.432 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cfcc3211-c1c7-4434-b667-a9f224a149de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.433 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.435 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.435 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4fc1901-1b18-4da7-8e2d-31eb8a00ec3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.436 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[911c1de1-ba62-42c6-b956-29b523a42513]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.451 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[eb279341-7157-4058-87a8-1d5d3fbe755e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.467 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfebdc2-fa69-4fad-83a9-b031b2d0606c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.499 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f626b4e3-d14d-4399-9855-836993759827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.5080] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.507 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c54296cc-7a0f-437e-8b37-e94be347080c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.540 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[04ec1c90-610d-4303-9382-98f284261cb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.544 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6c211d8e-876c-4c74-ba40-34d346ccbe71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.5694] device (tap165f7f23-d0): carrier: link connected
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.576 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[946208b0-3a0d-490c-9bde-027500469e97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.595 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fadb4a61-84b8-4699-8a66-6dd2c4e4a059]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519721, 'reachable_time': 44652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227160, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.610 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[36325a1e-e853-487f-9953-2c9b3498a1d8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519721, 'tstamp': 519721}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227161, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.628 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f9053c5c-6035-4b65-8084-737469466bf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519721, 'reachable_time': 44652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227162, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.660 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea417dc-eedc-453d-8d63-85f5ed942faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.726 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d254b4c9-618b-4405-bd41-53143c2b9741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.728 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.728 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.729 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 03:00:56 np0005531888 NetworkManager[55166]: <info>  [1763798456.7315] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.730 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.733 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.737 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.738 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:00:56Z|00256|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.741 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.742 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e988bf-e51e-403d-a7e7-0d97c4f7a6f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.743 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:00:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:00:56.745 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.752 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.994 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.995 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798456.9936564, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.995 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:00:56 np0005531888 nova_compute[186788]: 2025-11-22 08:00:56.997 186792 DEBUG nova.compute.manager [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.001 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance rebooted successfully.#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.002 186792 DEBUG nova.compute.manager [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.039 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.043 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.094 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.095 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798456.9950018, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.095 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Started (Lifecycle Event)#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.123 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.136 186792 DEBUG oslo_concurrency.lockutils [None req-24b1e396-7228-4a70-a737-ba58a7af403b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.140 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:57 np0005531888 podman[227201]: 2025-11-22 08:00:57.199305044 +0000 UTC m=+0.100953393 container create 6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 03:00:57 np0005531888 podman[227201]: 2025-11-22 08:00:57.125428098 +0000 UTC m=+0.027076467 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:00:57 np0005531888 systemd[1]: Started libpod-conmon-6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625.scope.
Nov 22 03:00:57 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:00:57 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90cb6f0c546c1c4ff747f2d6d97164191665ee9cc9be68e93e318c370d5add2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:00:57 np0005531888 podman[227201]: 2025-11-22 08:00:57.317413548 +0000 UTC m=+0.219061927 container init 6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:00:57 np0005531888 podman[227201]: 2025-11-22 08:00:57.327711472 +0000 UTC m=+0.229359851 container start 6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:00:57 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [NOTICE]   (227221) : New worker (227223) forked
Nov 22 03:00:57 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [NOTICE]   (227221) : Loading success.
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:57 np0005531888 nova_compute[186788]: 2025-11-22 08:00:57.956 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.526 186792 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.527 186792 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.527 186792 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.527 186792 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.527 186792 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.528 186792 WARNING nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.528 186792 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.528 186792 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.528 186792 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.528 186792 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.528 186792 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:58 np0005531888 nova_compute[186788]: 2025-11-22 08:00:58.529 186792 WARNING nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.624 186792 INFO nova.compute.manager [None req-4c432c20-66ae-41ce-b50d-e79e16f0c647 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Get console output#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.631 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.991 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:59 np0005531888 nova_compute[186788]: 2025-11-22 08:00:59.992 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.077 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.223 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.225 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.290 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.298 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.367 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.368 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.435 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.631 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.632 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5405MB free_disk=73.28464126586914GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.632 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.633 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.743 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 80cb8b15-443c-424b-894c-1ed6674f77d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.744 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.744 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.744 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.811 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.818 186792 INFO nova.compute.manager [None req-491831c7-a61a-4707-9e7f-6a673a456066 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Get console output#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.823 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.826 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.858 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:01:00 np0005531888 nova_compute[186788]: 2025-11-22 08:01:00.858 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:01 np0005531888 nova_compute[186788]: 2025-11-22 08:01:01.265 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:02 np0005531888 nova_compute[186788]: 2025-11-22 08:01:02.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:03 np0005531888 podman[227258]: 2025-11-22 08:01:03.708574947 +0000 UTC m=+0.071754715 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:01:06 np0005531888 nova_compute[186788]: 2025-11-22 08:01:06.271 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:06 np0005531888 podman[227279]: 2025-11-22 08:01:06.67633248 +0000 UTC m=+0.047987952 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.683 186792 DEBUG oslo_concurrency.lockutils [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.684 186792 DEBUG oslo_concurrency.lockutils [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.684 186792 DEBUG nova.compute.manager [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.688 186792 DEBUG nova.compute.manager [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.689 186792 DEBUG nova.objects.instance [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.721 186792 DEBUG nova.objects.instance [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'info_cache' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.757 186792 DEBUG nova.virt.libvirt.driver [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:01:07 np0005531888 nova_compute[186788]: 2025-11-22 08:01:07.961 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:10 np0005531888 podman[227305]: 2025-11-22 08:01:10.681638193 +0000 UTC m=+0.057110125 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git)
Nov 22 03:01:11 np0005531888 nova_compute[186788]: 2025-11-22 08:01:11.272 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.296 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.298 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.315 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.447 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.448 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.455 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.455 186792 INFO nova.compute.claims [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.655 186792 DEBUG nova.compute.provider_tree [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.672 186792 DEBUG nova.scheduler.client.report [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.702 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.703 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.759 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.759 186792 DEBUG nova.network.neutron [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.786 186792 INFO nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.800 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.921 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.924 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.924 186792 INFO nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Creating image(s)#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.925 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.926 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.926 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.940 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:12 np0005531888 nova_compute[186788]: 2025-11-22 08:01:12.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.002 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.003 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.004 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.015 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.093 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.094 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.134 186792 DEBUG nova.policy [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.589 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk 1073741824" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.591 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.591 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.653 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.654 186792 DEBUG nova.virt.disk.api [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Checking if we can resize image /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.654 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.712 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.714 186792 DEBUG nova.virt.disk.api [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Cannot resize image /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.715 186792 DEBUG nova.objects.instance [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.745 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.745 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Ensure instance console log exists: /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.746 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.747 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:13 np0005531888 nova_compute[186788]: 2025-11-22 08:01:13.747 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:13Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:01:14 np0005531888 nova_compute[186788]: 2025-11-22 08:01:14.965 186792 DEBUG nova.network.neutron [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Successfully created port: 43ae6beb-d59a-483d-8ced-1303f84a69d1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:01:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:15.388 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:15 np0005531888 nova_compute[186788]: 2025-11-22 08:01:15.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:15.390 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:01:16 np0005531888 nova_compute[186788]: 2025-11-22 08:01:16.275 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:16 np0005531888 nova_compute[186788]: 2025-11-22 08:01:16.489 186792 DEBUG nova.network.neutron [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Successfully updated port: 43ae6beb-d59a-483d-8ced-1303f84a69d1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:01:16 np0005531888 nova_compute[186788]: 2025-11-22 08:01:16.504 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:16 np0005531888 nova_compute[186788]: 2025-11-22 08:01:16.504 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:16 np0005531888 nova_compute[186788]: 2025-11-22 08:01:16.504 186792 DEBUG nova.network.neutron [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:01:16 np0005531888 podman[227352]: 2025-11-22 08:01:16.821605784 +0000 UTC m=+0.196770268 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 03:01:16 np0005531888 podman[227353]: 2025-11-22 08:01:16.88770476 +0000 UTC m=+0.259424530 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:01:16 np0005531888 nova_compute[186788]: 2025-11-22 08:01:16.941 186792 DEBUG nova.network.neutron [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:01:17 np0005531888 nova_compute[186788]: 2025-11-22 08:01:17.801 186792 DEBUG nova.virt.libvirt.driver [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:01:17 np0005531888 nova_compute[186788]: 2025-11-22 08:01:17.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.084 186792 DEBUG nova.compute.manager [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-changed-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.084 186792 DEBUG nova.compute.manager [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Refreshing instance network info cache due to event network-changed-43ae6beb-d59a-483d-8ced-1303f84a69d1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.084 186792 DEBUG oslo_concurrency.lockutils [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.365 186792 DEBUG nova.network.neutron [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.398 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.398 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance network_info: |[{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.399 186792 DEBUG oslo_concurrency.lockutils [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.400 186792 DEBUG nova.network.neutron [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Refreshing network info cache for port 43ae6beb-d59a-483d-8ced-1303f84a69d1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.402 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Start _get_guest_xml network_info=[{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.406 186792 WARNING nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.429 186792 DEBUG nova.virt.libvirt.host [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.429 186792 DEBUG nova.virt.libvirt.host [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.438 186792 DEBUG nova.virt.libvirt.host [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.438 186792 DEBUG nova.virt.libvirt.host [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.440 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.440 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.440 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.441 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.441 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.441 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.441 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.442 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.442 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.442 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.442 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.443 186792 DEBUG nova.virt.hardware [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.446 186792 DEBUG nova.virt.libvirt.vif [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-767428975',display_name='tempest-ServerActionsTestOtherB-server-767428975',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-767428975',id=96,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-41j79svb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=aaf09935-3011-4bf6-bdf9-28fe60097c1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.446 186792 DEBUG nova.network.os_vif_util [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.447 186792 DEBUG nova.network.os_vif_util [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.448 186792 DEBUG nova.objects.instance [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.462 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <uuid>aaf09935-3011-4bf6-bdf9-28fe60097c1c</uuid>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <name>instance-00000060</name>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestOtherB-server-767428975</nova:name>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:01:18</nova:creationTime>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        <nova:port uuid="43ae6beb-d59a-483d-8ced-1303f84a69d1">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <entry name="serial">aaf09935-3011-4bf6-bdf9-28fe60097c1c</entry>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <entry name="uuid">aaf09935-3011-4bf6-bdf9-28fe60097c1c</entry>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:57:82:86"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <target dev="tap43ae6beb-d5"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/console.log" append="off"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:01:18 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:01:18 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:01:18 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:01:18 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.463 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Preparing to wait for external event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.464 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.464 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.464 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.465 186792 DEBUG nova.virt.libvirt.vif [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-767428975',display_name='tempest-ServerActionsTestOtherB-server-767428975',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-767428975',id=96,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-41j79svb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=aaf09935-3011-4bf6-bdf9-28fe60097c1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.465 186792 DEBUG nova.network.os_vif_util [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.465 186792 DEBUG nova.network.os_vif_util [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.466 186792 DEBUG os_vif [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.467 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.467 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.467 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.469 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.470 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43ae6beb-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.470 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43ae6beb-d5, col_values=(('external_ids', {'iface-id': '43ae6beb-d59a-483d-8ced-1303f84a69d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:82:86', 'vm-uuid': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.471 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531888 NetworkManager[55166]: <info>  [1763798478.4724] manager: (tap43ae6beb-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.474 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.480 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.482 186792 INFO os_vif [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5')#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.696 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.697 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.697 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No VIF found with MAC fa:16:3e:57:82:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:01:18 np0005531888 nova_compute[186788]: 2025-11-22 08:01:18.698 186792 INFO nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Using config drive#033[00m
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.626 186792 INFO nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Creating config drive at /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config#033[00m
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.632 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4pf7dr7m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.759 186792 DEBUG oslo_concurrency.processutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4pf7dr7m" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:19 np0005531888 kernel: tap43ae6beb-d5: entered promiscuous mode
Nov 22 03:01:19 np0005531888 NetworkManager[55166]: <info>  [1763798479.8272] manager: (tap43ae6beb-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Nov 22 03:01:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:19Z|00257|binding|INFO|Claiming lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 for this chassis.
Nov 22 03:01:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:19Z|00258|binding|INFO|43ae6beb-d59a-483d-8ced-1303f84a69d1: Claiming fa:16:3e:57:82:86 10.100.0.11
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.828 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.835 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:82:86 10.100.0.11'], port_security=['fa:16:3e:57:82:86 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=43ae6beb-d59a-483d-8ced-1303f84a69d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.836 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 43ae6beb-d59a-483d-8ced-1303f84a69d1 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 bound to our chassis#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.838 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 03:01:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:19Z|00259|binding|INFO|Setting lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 ovn-installed in OVS
Nov 22 03:01:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:19Z|00260|binding|INFO|Setting lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 up in Southbound
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.843 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.846 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.859 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aca84528-e4c1-4c19-9fbd-e7784b0643bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:19 np0005531888 systemd-udevd[227418]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:01:19 np0005531888 systemd-machined[153106]: New machine qemu-43-instance-00000060.
Nov 22 03:01:19 np0005531888 NetworkManager[55166]: <info>  [1763798479.8833] device (tap43ae6beb-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:01:19 np0005531888 NetworkManager[55166]: <info>  [1763798479.8842] device (tap43ae6beb-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:01:19 np0005531888 systemd[1]: Started Virtual Machine qemu-43-instance-00000060.
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.893 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac8b7ea-7310-4b6d-a68c-21da4001b766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.897 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac4f660-200b-4568-935f-ae624fd433ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.925 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b7909f0b-b8b5-4965-ad26-c72db417b3b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.944 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9407d149-babc-402b-8bf4-54983b1b2d79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227431, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.962 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2058bba8-d12d-485b-af6f-2e47d197d0ea]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506813, 'tstamp': 506813}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227433, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506816, 'tstamp': 506816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227433, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.965 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:19 np0005531888 nova_compute[186788]: 2025-11-22 08:01:19.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.968 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.968 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.969 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:19.969 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.421 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798480.421275, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.422 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Started (Lifecycle Event)#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.448 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.452 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798480.4252186, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.453 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.465 186792 DEBUG nova.compute.manager [req-b5ce6ba3-0ecc-4ab5-bf38-63916cd36143 req-cec10671-130f-4d66-944f-3b92c0d3b3b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.466 186792 DEBUG oslo_concurrency.lockutils [req-b5ce6ba3-0ecc-4ab5-bf38-63916cd36143 req-cec10671-130f-4d66-944f-3b92c0d3b3b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.466 186792 DEBUG oslo_concurrency.lockutils [req-b5ce6ba3-0ecc-4ab5-bf38-63916cd36143 req-cec10671-130f-4d66-944f-3b92c0d3b3b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.466 186792 DEBUG oslo_concurrency.lockutils [req-b5ce6ba3-0ecc-4ab5-bf38-63916cd36143 req-cec10671-130f-4d66-944f-3b92c0d3b3b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.467 186792 DEBUG nova.compute.manager [req-b5ce6ba3-0ecc-4ab5-bf38-63916cd36143 req-cec10671-130f-4d66-944f-3b92c0d3b3b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Processing event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.467 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.472 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.473 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.477 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798480.4710941, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.477 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.479 186792 INFO nova.virt.libvirt.driver [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance spawned successfully.#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.480 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.510 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.516 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.522 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.523 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.523 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.524 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.524 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.525 186792 DEBUG nova.virt.libvirt.driver [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.556 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.596 186792 INFO nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Took 7.67 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.596 186792 DEBUG nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.676 186792 INFO nova.compute.manager [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Took 8.28 seconds to build instance.#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.691 186792 DEBUG oslo_concurrency.lockutils [None req-b29a8643-3e1b-49fd-9756-303bd46bc15f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.824 186792 DEBUG nova.network.neutron [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updated VIF entry in instance network info cache for port 43ae6beb-d59a-483d-8ced-1303f84a69d1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.825 186792 DEBUG nova.network.neutron [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:20 np0005531888 nova_compute[186788]: 2025-11-22 08:01:20.839 186792 DEBUG oslo_concurrency.lockutils [req-122acf33-9a64-47a3-8153-7437522a7adb req-e07e869d-9e12-4e46-9cf3-907d7c54c0ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:21 np0005531888 kernel: tapa2f45e58-23 (unregistering): left promiscuous mode
Nov 22 03:01:21 np0005531888 NetworkManager[55166]: <info>  [1763798481.0112] device (tapa2f45e58-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:01:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:21Z|00261|binding|INFO|Releasing lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe from this chassis (sb_readonly=0)
Nov 22 03:01:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:21Z|00262|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe down in Southbound
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.016 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:21Z|00263|binding|INFO|Removing iface tapa2f45e58-23 ovn-installed in OVS
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:21.023 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:21.025 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:01:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:21.027 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:01:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:21.028 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcc8503-ccc4-4f92-974b-21469df33729]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:21.029 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore#033[00m
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.034 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:21 np0005531888 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Nov 22 03:01:21 np0005531888 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005d.scope: Consumed 15.995s CPU time.
Nov 22 03:01:21 np0005531888 systemd-machined[153106]: Machine qemu-42-instance-0000005d terminated.
Nov 22 03:01:21 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [NOTICE]   (227221) : haproxy version is 2.8.14-c23fe91
Nov 22 03:01:21 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [NOTICE]   (227221) : path to executable is /usr/sbin/haproxy
Nov 22 03:01:21 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [WARNING]  (227221) : Exiting Master process...
Nov 22 03:01:21 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [WARNING]  (227221) : Exiting Master process...
Nov 22 03:01:21 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [ALERT]    (227221) : Current worker (227223) exited with code 143 (Terminated)
Nov 22 03:01:21 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227217]: [WARNING]  (227221) : All workers exited. Exiting... (0)
Nov 22 03:01:21 np0005531888 systemd[1]: libpod-6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625.scope: Deactivated successfully.
Nov 22 03:01:21 np0005531888 conmon[227217]: conmon 6f3a232d0794c10d5c32 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625.scope/container/memory.events
Nov 22 03:01:21 np0005531888 podman[227460]: 2025-11-22 08:01:21.300186636 +0000 UTC m=+0.171308813 container died 6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:01:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625-userdata-shm.mount: Deactivated successfully.
Nov 22 03:01:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay-90cb6f0c546c1c4ff747f2d6d97164191665ee9cc9be68e93e318c370d5add2b-merged.mount: Deactivated successfully.
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.824 186792 INFO nova.virt.libvirt.driver [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance shutdown successfully after 14 seconds.#033[00m
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.831 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance destroyed successfully.#033[00m
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.831 186792 DEBUG nova.objects.instance [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'numa_topology' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.843 186792 DEBUG nova.compute.manager [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:21 np0005531888 nova_compute[186788]: 2025-11-22 08:01:21.931 186792 DEBUG oslo_concurrency.lockutils [None req-03f0b6de-1c89-46b6-bcf8-b7bd30c0e68a b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:22.393 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.541 186792 DEBUG nova.compute.manager [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.542 186792 DEBUG oslo_concurrency.lockutils [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.542 186792 DEBUG oslo_concurrency.lockutils [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.542 186792 DEBUG oslo_concurrency.lockutils [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.543 186792 DEBUG nova.compute.manager [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] No waiting events found dispatching network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.543 186792 WARNING nova.compute.manager [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received unexpected event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.543 186792 DEBUG nova.compute.manager [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.543 186792 DEBUG oslo_concurrency.lockutils [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.543 186792 DEBUG oslo_concurrency.lockutils [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.544 186792 DEBUG oslo_concurrency.lockutils [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.544 186792 DEBUG nova.compute.manager [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.544 186792 WARNING nova.compute.manager [req-d7151bfd-2a1a-4aa2-a915-c23ec9cf2fd6 req-9b738eea-0d31-446c-8334-5020cf32560a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state stopped and task_state None.#033[00m
Nov 22 03:01:22 np0005531888 podman[227460]: 2025-11-22 08:01:22.626253582 +0000 UTC m=+1.497375759 container cleanup 6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:01:22 np0005531888 systemd[1]: libpod-conmon-6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625.scope: Deactivated successfully.
Nov 22 03:01:22 np0005531888 nova_compute[186788]: 2025-11-22 08:01:22.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.310 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.333 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'info_cache' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:23 np0005531888 podman[227506]: 2025-11-22 08:01:23.360774043 +0000 UTC m=+0.710901051 container remove 6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.366 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5ace6265-402b-4779-8be1-48dd16410316]: (4, ('Sat Nov 22 08:01:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625)\n6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625\nSat Nov 22 08:01:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625)\n6f3a232d0794c10d5c3216aae69410b648deda46dd49b3a65e689592c592c625\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.368 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c275cd37-d485-4f9d-9251-3e06cd488037]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.369 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.372 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:23 np0005531888 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.380 186792 DEBUG oslo_concurrency.lockutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.381 186792 DEBUG oslo_concurrency.lockutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.381 186792 DEBUG nova.network.neutron [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.392 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f57fc6ff-1cc7-4e25-9647-c965d2151c6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.409 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[142a4e14-9373-483f-bfb2-5a7655e8027b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.410 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[38f79a07-d7b6-4d27-9fb5-19a22c41088a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.424 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8c610580-f937-4b01-a6b1-822e886f78b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519713, 'reachable_time': 22248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227524, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.427 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:01:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:23.429 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[92662ec6-df32-4c39-bc3c-11469f94d8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:23 np0005531888 nova_compute[186788]: 2025-11-22 08:01:23.471 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:24 np0005531888 nova_compute[186788]: 2025-11-22 08:01:24.758 186792 DEBUG nova.compute.manager [req-f52d59dc-b711-4b9c-aaa2-621898e32b95 req-8f943204-0e87-49b5-8097-24a5b1769f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:24 np0005531888 nova_compute[186788]: 2025-11-22 08:01:24.759 186792 DEBUG oslo_concurrency.lockutils [req-f52d59dc-b711-4b9c-aaa2-621898e32b95 req-8f943204-0e87-49b5-8097-24a5b1769f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:24 np0005531888 nova_compute[186788]: 2025-11-22 08:01:24.759 186792 DEBUG oslo_concurrency.lockutils [req-f52d59dc-b711-4b9c-aaa2-621898e32b95 req-8f943204-0e87-49b5-8097-24a5b1769f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:24 np0005531888 nova_compute[186788]: 2025-11-22 08:01:24.759 186792 DEBUG oslo_concurrency.lockutils [req-f52d59dc-b711-4b9c-aaa2-621898e32b95 req-8f943204-0e87-49b5-8097-24a5b1769f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:24 np0005531888 nova_compute[186788]: 2025-11-22 08:01:24.759 186792 DEBUG nova.compute.manager [req-f52d59dc-b711-4b9c-aaa2-621898e32b95 req-8f943204-0e87-49b5-8097-24a5b1769f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:24 np0005531888 nova_compute[186788]: 2025-11-22 08:01:24.760 186792 WARNING nova.compute.manager [req-f52d59dc-b711-4b9c-aaa2-621898e32b95 req-8f943204-0e87-49b5-8097-24a5b1769f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.220 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.830 186792 DEBUG nova.network.neutron [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.850 186792 DEBUG oslo_concurrency.lockutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.880 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance destroyed successfully.#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.881 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'numa_topology' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.895 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.909 186792 DEBUG nova.virt.libvirt.vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.910 186792 DEBUG nova.network.os_vif_util [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.911 186792 DEBUG nova.network.os_vif_util [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.911 186792 DEBUG os_vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.914 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.914 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2f45e58-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.915 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.917 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.919 186792 INFO os_vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.925 186792 DEBUG nova.virt.libvirt.driver [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start _get_guest_xml network_info=[{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.930 186792 WARNING nova.virt.libvirt.driver [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.939 186792 DEBUG nova.virt.libvirt.host [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.940 186792 DEBUG nova.virt.libvirt.host [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.946 186792 DEBUG nova.virt.libvirt.host [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.947 186792 DEBUG nova.virt.libvirt.host [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.949 186792 DEBUG nova.virt.libvirt.driver [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.949 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.950 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.950 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.950 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.951 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.951 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.951 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.951 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.952 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.952 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.952 186792 DEBUG nova.virt.hardware [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.953 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.978 186792 DEBUG nova.virt.libvirt.vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.979 186792 DEBUG nova.network.os_vif_util [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.980 186792 DEBUG nova.network.os_vif_util [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.981 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:25 np0005531888 nova_compute[186788]: 2025-11-22 08:01:25.998 186792 DEBUG nova.virt.libvirt.driver [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:01:25 np0005531888 nova_compute[186788]:  <uuid>eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</uuid>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:  <name>instance-0000005d</name>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestJSON-server-1519356482</nova:name>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:01:25</nova:creationTime>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:        <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:01:25 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:        <nova:port uuid="a2f45e58-237f-4de0-8339-5f17a4ad3cfe">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <entry name="serial">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <entry name="uuid">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:df:95:59"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <target dev="tapa2f45e58-23"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log" append="off"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:01:26 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:01:26 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:01:26 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:01:26 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.000 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.063 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.064 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.125 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.126 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.142 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.209 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.210 186792 DEBUG nova.virt.disk.api [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.211 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.278 186792 DEBUG oslo_concurrency.processutils [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.279 186792 DEBUG nova.virt.disk.api [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.280 186792 DEBUG nova.objects.instance [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.296 186792 DEBUG nova.virt.libvirt.vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.296 186792 DEBUG nova.network.os_vif_util [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.297 186792 DEBUG nova.network.os_vif_util [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.298 186792 DEBUG os_vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.298 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.299 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.299 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.302 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2f45e58-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.305 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2f45e58-23, col_values=(('external_ids', {'iface-id': 'a2f45e58-237f-4de0-8339-5f17a4ad3cfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:95:59', 'vm-uuid': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.306 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.3073] manager: (tapa2f45e58-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.310 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.311 186792 INFO os_vif [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:01:26 np0005531888 podman[227545]: 2025-11-22 08:01:26.403895548 +0000 UTC m=+0.054570643 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:01:26 np0005531888 podman[227544]: 2025-11-22 08:01:26.404588146 +0000 UTC m=+0.056732857 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:01:26 np0005531888 kernel: tapa2f45e58-23: entered promiscuous mode
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.4316] manager: (tapa2f45e58-23): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Nov 22 03:01:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:26Z|00264|binding|INFO|Claiming lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe for this chassis.
Nov 22 03:01:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:26Z|00265|binding|INFO|a2f45e58-237f-4de0-8339-5f17a4ad3cfe: Claiming fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.447 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:26Z|00266|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe ovn-installed in OVS
Nov 22 03:01:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:26Z|00267|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe up in Southbound
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.448 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.450 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.452 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.467 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[358431c8-54ef-4283-92fc-eab2d29824bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.468 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.470 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.470 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec46288-7440-40b2-a4cb-f4bc8b99d140]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.471 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a7aa7985-371a-48f5-8486-f8217004ac4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 systemd-machined[153106]: New machine qemu-44-instance-0000005d.
Nov 22 03:01:26 np0005531888 systemd-udevd[227598]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.485 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccae411-2dae-427f-89a0-2a1a448a667c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 systemd[1]: Started Virtual Machine qemu-44-instance-0000005d.
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.5004] device (tapa2f45e58-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.5015] device (tapa2f45e58-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.502 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb690ee-b177-4be6-b3f1-a2bada32447f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.533 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc01a87-356e-477d-b25d-05bb8a9b804d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 systemd-udevd[227602]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.539 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fa161ff7-001c-4bc4-85dd-9c21944506a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.5401] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.571 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[612e67f7-12fc-4347-a49c-24d3532b5cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.575 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd1cedf-60c1-46e9-99a0-e756174325c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.6022] device (tap165f7f23-d0): carrier: link connected
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.609 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ba2084-e3ad-4573-9072-8f4a8592a234]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.627 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ff26ebb6-7322-4a79-b3fd-7458a49351d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522724, 'reachable_time': 23131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227630, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.642 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[639b6456-7e20-4c98-8841-98ffa9d36d75]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522724, 'tstamp': 522724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227631, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.660 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cc01f58d-5aad-4c37-8d6d-7cd48a4ab06b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522724, 'reachable_time': 23131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227632, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.693 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a58296c0-0665-4002-9589-75fab72dd586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.766 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[023c9c45-3295-4c12-a1d6-74e6304a21df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.768 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.768 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.768 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.770 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 NetworkManager[55166]: <info>  [1763798486.7712] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Nov 22 03:01:26 np0005531888 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.773 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.773 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.774 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:26Z|00268|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:01:26 np0005531888 nova_compute[186788]: 2025-11-22 08:01:26.786 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.786 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.787 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1bef970c-016f-4667-9705-c2e5edda10c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.788 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:01:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:26.789 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:01:27 np0005531888 podman[227664]: 2025-11-22 08:01:27.129594443 +0000 UTC m=+0.026769101 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.269 186792 DEBUG nova.compute.manager [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-changed-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.270 186792 DEBUG nova.compute.manager [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Refreshing instance network info cache due to event network-changed-43ae6beb-d59a-483d-8ced-1303f84a69d1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.271 186792 DEBUG oslo_concurrency.lockutils [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.271 186792 DEBUG oslo_concurrency.lockutils [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.271 186792 DEBUG nova.network.neutron [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Refreshing network info cache for port 43ae6beb-d59a-483d-8ced-1303f84a69d1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.466 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.466 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798487.465552, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.467 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.470 186792 DEBUG nova.compute.manager [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.476 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance rebooted successfully.#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.477 186792 DEBUG nova.compute.manager [None req-a6952cdb-d5fd-4f6a-b829-91fbaa2f24d7 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.517 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.522 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.565 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.566 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798487.4672422, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.567 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Started (Lifecycle Event)#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.586 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.598 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:27 np0005531888 nova_compute[186788]: 2025-11-22 08:01:27.970 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:28 np0005531888 podman[227664]: 2025-11-22 08:01:28.21088951 +0000 UTC m=+1.108064168 container create fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:01:28 np0005531888 systemd[1]: Started libpod-conmon-fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32.scope.
Nov 22 03:01:28 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:01:28 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5193f37d6893f760ca6b601fe8236604db5bfc0f61bcc5bac8ddf6e4d9310162/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:01:28 np0005531888 podman[227664]: 2025-11-22 08:01:28.739178589 +0000 UTC m=+1.636353227 container init fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:01:28 np0005531888 podman[227664]: 2025-11-22 08:01:28.745588747 +0000 UTC m=+1.642763385 container start fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:01:28 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [NOTICE]   (227692) : New worker (227694) forked
Nov 22 03:01:28 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [NOTICE]   (227692) : Loading success.
Nov 22 03:01:29 np0005531888 nova_compute[186788]: 2025-11-22 08:01:29.462 186792 DEBUG nova.compute.manager [req-0a809fb8-d505-43f9-b38e-6d9ded940a77 req-396dd9c6-de92-44f4-97b3-ff96219b7dee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:29 np0005531888 nova_compute[186788]: 2025-11-22 08:01:29.463 186792 DEBUG oslo_concurrency.lockutils [req-0a809fb8-d505-43f9-b38e-6d9ded940a77 req-396dd9c6-de92-44f4-97b3-ff96219b7dee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:29 np0005531888 nova_compute[186788]: 2025-11-22 08:01:29.464 186792 DEBUG oslo_concurrency.lockutils [req-0a809fb8-d505-43f9-b38e-6d9ded940a77 req-396dd9c6-de92-44f4-97b3-ff96219b7dee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:29 np0005531888 nova_compute[186788]: 2025-11-22 08:01:29.464 186792 DEBUG oslo_concurrency.lockutils [req-0a809fb8-d505-43f9-b38e-6d9ded940a77 req-396dd9c6-de92-44f4-97b3-ff96219b7dee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:29 np0005531888 nova_compute[186788]: 2025-11-22 08:01:29.464 186792 DEBUG nova.compute.manager [req-0a809fb8-d505-43f9-b38e-6d9ded940a77 req-396dd9c6-de92-44f4-97b3-ff96219b7dee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:29 np0005531888 nova_compute[186788]: 2025-11-22 08:01:29.464 186792 WARNING nova.compute.manager [req-0a809fb8-d505-43f9-b38e-6d9ded940a77 req-396dd9c6-de92-44f4-97b3-ff96219b7dee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.623 186792 DEBUG nova.network.neutron [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updated VIF entry in instance network info cache for port 43ae6beb-d59a-483d-8ced-1303f84a69d1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.624 186792 DEBUG nova.network.neutron [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.638 186792 DEBUG oslo_concurrency.lockutils [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.639 186792 DEBUG nova.compute.manager [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.640 186792 DEBUG oslo_concurrency.lockutils [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.640 186792 DEBUG oslo_concurrency.lockutils [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.640 186792 DEBUG oslo_concurrency.lockutils [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.641 186792 DEBUG nova.compute.manager [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:30 np0005531888 nova_compute[186788]: 2025-11-22 08:01:30.641 186792 WARNING nova.compute.manager [req-ea3736ca-e148-4b2a-ae93-fb01102f265d req-87bcedf2-8bf5-4538-8e24-dd233d5ebda4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 22 03:01:31 np0005531888 nova_compute[186788]: 2025-11-22 08:01:31.308 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:32 np0005531888 nova_compute[186788]: 2025-11-22 08:01:32.074 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:32 np0005531888 nova_compute[186788]: 2025-11-22 08:01:32.972 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:34 np0005531888 podman[227718]: 2025-11-22 08:01:34.693710772 +0000 UTC m=+0.062280863 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.818 186792 INFO nova.compute.manager [None req-fd3dd0ca-3d62-4999-b947-b133deec8957 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Pausing#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.819 186792 DEBUG nova.objects.instance [None req-fd3dd0ca-3d62-4999-b947-b133deec8957 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.854 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798494.8540034, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.855 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.857 186792 DEBUG nova.compute.manager [None req-fd3dd0ca-3d62-4999-b947-b133deec8957 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.877 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.880 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:34 np0005531888 nova_compute[186788]: 2025-11-22 08:01:34.914 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.560 186792 INFO nova.compute.manager [None req-433e9b38-c13f-4e25-9eeb-624a9ed13714 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Unpausing#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.561 186792 DEBUG nova.objects.instance [None req-433e9b38-c13f-4e25-9eeb-624a9ed13714 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.629 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798495.6282108, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.632 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:01:35 np0005531888 virtqemud[186358]: argument unsupported: QEMU guest agent is not configured
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.639 186792 DEBUG nova.virt.libvirt.guest [None req-433e9b38-c13f-4e25-9eeb-624a9ed13714 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.640 186792 DEBUG nova.compute.manager [None req-433e9b38-c13f-4e25-9eeb-624a9ed13714 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.680 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.685 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:35 np0005531888 nova_compute[186788]: 2025-11-22 08:01:35.723 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Nov 22 03:01:36 np0005531888 nova_compute[186788]: 2025-11-22 08:01:36.310 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:36.814 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:36.816 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:36.818 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:37 np0005531888 podman[227749]: 2025-11-22 08:01:37.68775971 +0000 UTC m=+0.055027004 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:01:37 np0005531888 nova_compute[186788]: 2025-11-22 08:01:37.974 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:41 np0005531888 nova_compute[186788]: 2025-11-22 08:01:41.312 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:41 np0005531888 podman[227781]: 2025-11-22 08:01:41.747838051 +0000 UTC m=+0.121695094 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, release=1755695350, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Nov 22 03:01:42 np0005531888 nova_compute[186788]: 2025-11-22 08:01:42.976 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:43Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:82:86 10.100.0.11
Nov 22 03:01:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:43Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:82:86 10.100.0.11
Nov 22 03:01:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:45Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:01:46 np0005531888 nova_compute[186788]: 2025-11-22 08:01:46.318 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:47 np0005531888 podman[227810]: 2025-11-22 08:01:47.722738893 +0000 UTC m=+0.089208704 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:01:47 np0005531888 podman[227811]: 2025-11-22 08:01:47.785418425 +0000 UTC m=+0.146298278 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:01:47 np0005531888 nova_compute[186788]: 2025-11-22 08:01:47.979 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:51 np0005531888 nova_compute[186788]: 2025-11-22 08:01:51.322 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:52 np0005531888 nova_compute[186788]: 2025-11-22 08:01:52.853 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:52 np0005531888 nova_compute[186788]: 2025-11-22 08:01:52.853 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:52 np0005531888 nova_compute[186788]: 2025-11-22 08:01:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:52 np0005531888 nova_compute[186788]: 2025-11-22 08:01:52.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:53 np0005531888 nova_compute[186788]: 2025-11-22 08:01:53.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:53 np0005531888 nova_compute[186788]: 2025-11-22 08:01:53.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:01:54 np0005531888 nova_compute[186788]: 2025-11-22 08:01:54.646 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:54 np0005531888 nova_compute[186788]: 2025-11-22 08:01:54.646 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:54 np0005531888 nova_compute[186788]: 2025-11-22 08:01:54.646 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.324 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:56 np0005531888 systemd[1]: Starting dnf makecache...
Nov 22 03:01:56 np0005531888 podman[227856]: 2025-11-22 08:01:56.686130026 +0000 UTC m=+0.052578083 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:01:56 np0005531888 podman[227857]: 2025-11-22 08:01:56.686919626 +0000 UTC m=+0.048995346 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.891 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:56 np0005531888 dnf[227858]: Metadata cache refreshed recently.
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.968 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.969 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:56 np0005531888 nova_compute[186788]: 2025-11-22 08:01:56.973 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:01:56 np0005531888 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 03:01:56 np0005531888 systemd[1]: Finished dnf makecache.
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.143 186792 DEBUG oslo_concurrency.lockutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.144 186792 DEBUG oslo_concurrency.lockutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.144 186792 INFO nova.compute.manager [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Rebooting instance#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.161 186792 DEBUG oslo_concurrency.lockutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.161 186792 DEBUG oslo_concurrency.lockutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.162 186792 DEBUG nova.network.neutron [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:01:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:57.765 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.766 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:57.767 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:01:57 np0005531888 nova_compute[186788]: 2025-11-22 08:01:57.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.193 186792 DEBUG nova.network.neutron [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.214 186792 DEBUG oslo_concurrency.lockutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.221 186792 DEBUG nova.compute.manager [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:59 np0005531888 kernel: tapa2f45e58-23 (unregistering): left promiscuous mode
Nov 22 03:01:59 np0005531888 NetworkManager[55166]: <info>  [1763798519.4158] device (tapa2f45e58-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:01:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:59Z|00269|binding|INFO|Releasing lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe from this chassis (sb_readonly=0)
Nov 22 03:01:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:59Z|00270|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe down in Southbound
Nov 22 03:01:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:01:59Z|00271|binding|INFO|Removing iface tapa2f45e58-23 ovn-installed in OVS
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.424 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:59.431 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:59.432 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:01:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:59.435 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.437 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:59.436 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7c7b90-81ce-424e-9ce9-80c0d10d2879]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:01:59.439 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore#033[00m
Nov 22 03:01:59 np0005531888 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Nov 22 03:01:59 np0005531888 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005d.scope: Consumed 18.334s CPU time.
Nov 22 03:01:59 np0005531888 systemd-machined[153106]: Machine qemu-44-instance-0000005d terminated.
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.647 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance destroyed successfully.#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.647 186792 DEBUG nova.objects.instance [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:59 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [NOTICE]   (227692) : haproxy version is 2.8.14-c23fe91
Nov 22 03:01:59 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [NOTICE]   (227692) : path to executable is /usr/sbin/haproxy
Nov 22 03:01:59 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [WARNING]  (227692) : Exiting Master process...
Nov 22 03:01:59 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [ALERT]    (227692) : Current worker (227694) exited with code 143 (Terminated)
Nov 22 03:01:59 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[227688]: [WARNING]  (227692) : All workers exited. Exiting... (0)
Nov 22 03:01:59 np0005531888 systemd[1]: libpod-fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32.scope: Deactivated successfully.
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.667 186792 DEBUG nova.virt.libvirt.vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.668 186792 DEBUG nova.network.os_vif_util [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.669 186792 DEBUG nova.network.os_vif_util [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.669 186792 DEBUG os_vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:01:59 np0005531888 podman[227920]: 2025-11-22 08:01:59.67144676 +0000 UTC m=+0.128728256 container died fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.672 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.672 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2f45e58-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.674 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.676 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.680 186792 INFO os_vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.708 186792 DEBUG nova.virt.libvirt.driver [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start _get_guest_xml network_info=[{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.716 186792 WARNING nova.virt.libvirt.driver [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.722 186792 DEBUG nova.virt.libvirt.host [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.723 186792 DEBUG nova.virt.libvirt.host [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.728 186792 DEBUG nova.virt.libvirt.host [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.729 186792 DEBUG nova.virt.libvirt.host [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.730 186792 DEBUG nova.virt.libvirt.driver [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.731 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.731 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.731 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.732 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.732 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.732 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.732 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.732 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.733 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.733 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.733 186792 DEBUG nova.virt.hardware [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.733 186792 DEBUG nova.objects.instance [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.754 186792 DEBUG nova.virt.libvirt.vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.755 186792 DEBUG nova.network.os_vif_util [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.756 186792 DEBUG nova.network.os_vif_util [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.757 186792 DEBUG nova.objects.instance [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.784 186792 DEBUG nova.virt.libvirt.driver [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <uuid>eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</uuid>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <name>instance-0000005d</name>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestJSON-server-1519356482</nova:name>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:01:59</nova:creationTime>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        <nova:port uuid="a2f45e58-237f-4de0-8339-5f17a4ad3cfe">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <entry name="serial">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <entry name="uuid">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:df:95:59"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <target dev="tapa2f45e58-23"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log" append="off"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:01:59 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:01:59 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:01:59 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:01:59 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.786 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.863 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.864 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.927 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.929 186792 DEBUG nova.objects.instance [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.942 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.960 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.984 186792 DEBUG nova.compute.manager [req-581ffadb-80a5-453e-8073-11658bf0cfe4 req-9772554b-7cfe-4c4d-85bc-9cc9549c4eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.984 186792 DEBUG oslo_concurrency.lockutils [req-581ffadb-80a5-453e-8073-11658bf0cfe4 req-9772554b-7cfe-4c4d-85bc-9cc9549c4eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.984 186792 DEBUG oslo_concurrency.lockutils [req-581ffadb-80a5-453e-8073-11658bf0cfe4 req-9772554b-7cfe-4c4d-85bc-9cc9549c4eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.985 186792 DEBUG oslo_concurrency.lockutils [req-581ffadb-80a5-453e-8073-11658bf0cfe4 req-9772554b-7cfe-4c4d-85bc-9cc9549c4eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.985 186792 DEBUG nova.compute.manager [req-581ffadb-80a5-453e-8073-11658bf0cfe4 req-9772554b-7cfe-4c4d-85bc-9cc9549c4eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.985 186792 WARNING nova.compute.manager [req-581ffadb-80a5-453e-8073-11658bf0cfe4 req-9772554b-7cfe-4c4d-85bc-9cc9549c4eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state reboot_started_hard.#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.997 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.998 186792 DEBUG nova.virt.disk.api [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:01:59 np0005531888 nova_compute[186788]: 2025-11-22 08:01:59.998 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.055 186792 DEBUG oslo_concurrency.processutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.056 186792 DEBUG nova.virt.disk.api [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.057 186792 DEBUG nova.objects.instance [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:00 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32-userdata-shm.mount: Deactivated successfully.
Nov 22 03:02:00 np0005531888 systemd[1]: var-lib-containers-storage-overlay-5193f37d6893f760ca6b601fe8236604db5bfc0f61bcc5bac8ddf6e4d9310162-merged.mount: Deactivated successfully.
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.076 186792 DEBUG nova.virt.libvirt.vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.076 186792 DEBUG nova.network.os_vif_util [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.077 186792 DEBUG nova.network.os_vif_util [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.077 186792 DEBUG os_vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.078 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.078 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.079 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.081 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.081 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2f45e58-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.081 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2f45e58-23, col_values=(('external_ids', {'iface-id': 'a2f45e58-237f-4de0-8339-5f17a4ad3cfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:95:59', 'vm-uuid': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.083 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 NetworkManager[55166]: <info>  [1763798520.0838] manager: (tapa2f45e58-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.086 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.089 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.090 186792 INFO os_vif [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:02:00 np0005531888 kernel: tapa2f45e58-23: entered promiscuous mode
Nov 22 03:02:00 np0005531888 NetworkManager[55166]: <info>  [1763798520.2783] manager: (tapa2f45e58-23): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Nov 22 03:02:00 np0005531888 systemd-udevd[227901]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.279 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:00Z|00272|binding|INFO|Claiming lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe for this chassis.
Nov 22 03:02:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:00Z|00273|binding|INFO|a2f45e58-237f-4de0-8339-5f17a4ad3cfe: Claiming fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.288 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:00 np0005531888 NetworkManager[55166]: <info>  [1763798520.2922] device (tapa2f45e58-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:02:00 np0005531888 NetworkManager[55166]: <info>  [1763798520.2930] device (tapa2f45e58-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:00Z|00274|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe ovn-installed in OVS
Nov 22 03:02:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:00Z|00275|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe up in Southbound
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.298 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 systemd-machined[153106]: New machine qemu-45-instance-0000005d.
Nov 22 03:02:00 np0005531888 systemd[1]: Started Virtual Machine qemu-45-instance-0000005d.
Nov 22 03:02:00 np0005531888 podman[227920]: 2025-11-22 08:02:00.358913234 +0000 UTC m=+0.816194730 container cleanup fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:02:00 np0005531888 systemd[1]: libpod-conmon-fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32.scope: Deactivated successfully.
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.631 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.631 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798520.6304288, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.631 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.633 186792 DEBUG nova.compute.manager [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.638 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance rebooted successfully.#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.639 186792 DEBUG nova.compute.manager [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.656 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.659 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:02:00 np0005531888 podman[228002]: 2025-11-22 08:02:00.682907621 +0000 UTC m=+0.298719427 container remove fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.688 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.689 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798520.630552, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.689 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Started (Lifecycle Event)#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.689 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c7847bd4-0b50-4c6f-b5a8-a6c14304e076]: (4, ('Sat Nov 22 08:01:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32)\nfb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32\nSat Nov 22 08:02:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (fb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32)\nfb65e4c198736cb5c45c65b9db4f7931b42e72afd1b3678b368305eafb4f4d32\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.691 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[19a422fd-2f8b-4a72-875f-f88661da740e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.692 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.694 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.697 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.701 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[772b7bda-8f5d-4458-af66-7c6bb828c2a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.710 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.711 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.719 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c094ffa8-e17d-4eea-83e5-861b56d0bc9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.720 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2961f8-dc02-47c0-b083-755d3c539409]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.728 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.729 186792 DEBUG oslo_concurrency.lockutils [None req-67285a3c-6f05-4d2f-8bae-767a33549930 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.736 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[944cba8e-3b8a-4310-8c8a-e5a031f3d596]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522717, 'reachable_time': 27439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228023, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.739 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.740 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[01fd9242-cdae-4922-8386-7c24fa580da2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.745 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.746 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.757 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0bfaef-c52a-4b19-9c73-3d63917ea71c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.758 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.760 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.760 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d5161e77-c378-430b-807c-22bac5bf3a48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.761 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd65c75-0c7f-4b8c-a861-878939b8a3a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.769 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.773 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[86ea9714-d02b-4ae6-b8f3-25f973ec77b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.788 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9f36e3-7112-41bd-94fb-1072cf779bf4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.818 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[038e7773-1f65-4d74-90c5-3901a5170c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.823 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a699bf02-9198-40d7-850c-c09685e55f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 NetworkManager[55166]: <info>  [1763798520.8250] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/146)
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.856 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c57b14f2-3e3f-4b14-a440-866e6b2a36ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.859 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9390a627-b3ac-4ea0-8a4c-a992057337ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 NetworkManager[55166]: <info>  [1763798520.8847] device (tap165f7f23-d0): carrier: link connected
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.890 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bf702b7f-a47a-4c2b-9f93-bdc2060bf3ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.907 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d1c4f5-237b-4c07-b114-fe839a653506]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526152, 'reachable_time': 26049, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228048, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.922 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3adafe84-b239-4efc-9529-41568a971ccf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526152, 'tstamp': 526152}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228049, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.938 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[28ab29cc-454e-4979-9f1f-234c7aa960ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526152, 'reachable_time': 26049, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228050, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:00.973 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba4c7d1-e438-4f25-b6bc-a25d7e5fe648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.990 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.991 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.991 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:00 np0005531888 nova_compute[186788]: 2025-11-22 08:02:00.991 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.034 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3661a452-c6b8-42e0-8331-bf1e66aed2dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.036 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.036 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.036 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:01 np0005531888 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.039 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:01 np0005531888 NetworkManager[55166]: <info>  [1763798521.0419] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.043 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:01Z|00276|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.044 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.047 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.048 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4aabba-b840-49a6-bce5-f1428f94be4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.048 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:02:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:01.049 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.057 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.105 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.166 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.168 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.228 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.235 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.296 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.297 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.360 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.367 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.442 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.444 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:01 np0005531888 podman[228096]: 2025-11-22 08:02:01.406707928 +0000 UTC m=+0.026952225 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.511 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.746 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.748 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5341MB free_disk=73.25581359863281GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.748 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.748 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:01 np0005531888 podman[228096]: 2025-11-22 08:02:01.779939714 +0000 UTC m=+0.400184001 container create d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.835 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 80cb8b15-443c-424b-894c-1ed6674f77d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.835 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.836 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance aaf09935-3011-4bf6-bdf9-28fe60097c1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.836 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.836 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:02:01 np0005531888 systemd[1]: Started libpod-conmon-d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732.scope.
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.913 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:02:01 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:02:01 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab8d5943c2fffc70017a6d0ecd1b4ee00bb1d30b210f7e70649fc181d3a87013/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.937 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.960 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:02:01 np0005531888 nova_compute[186788]: 2025-11-22 08:02:01.960 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.061 186792 DEBUG nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.061 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.062 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.062 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.062 186792 DEBUG nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.062 186792 WARNING nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.063 186792 DEBUG nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.063 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.063 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.063 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.063 186792 DEBUG nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.063 186792 WARNING nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.064 186792 DEBUG nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.064 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.064 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.064 186792 DEBUG oslo_concurrency.lockutils [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.064 186792 DEBUG nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.064 186792 WARNING nova.compute.manager [req-ee4fb8a5-e006-4620-991b-6b5238a25224 req-4b6844d6-8d41-4314-9542-cab95a737bd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:02 np0005531888 podman[228096]: 2025-11-22 08:02:02.101412259 +0000 UTC m=+0.721656556 container init d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:02:02 np0005531888 podman[228096]: 2025-11-22 08:02:02.107816406 +0000 UTC m=+0.728060683 container start d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:02:02 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [NOTICE]   (228121) : New worker (228123) forked
Nov 22 03:02:02 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [NOTICE]   (228121) : Loading success.
Nov 22 03:02:02 np0005531888 nova_compute[186788]: 2025-11-22 08:02:02.985 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:05 np0005531888 nova_compute[186788]: 2025-11-22 08:02:05.084 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:05 np0005531888 podman[228132]: 2025-11-22 08:02:05.680396191 +0000 UTC m=+0.052062801 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:02:07 np0005531888 nova_compute[186788]: 2025-11-22 08:02:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:07 np0005531888 nova_compute[186788]: 2025-11-22 08:02:07.989 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:08 np0005531888 podman[228152]: 2025-11-22 08:02:08.701921145 +0000 UTC m=+0.075370005 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:02:10 np0005531888 nova_compute[186788]: 2025-11-22 08:02:10.086 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:12 np0005531888 podman[228175]: 2025-11-22 08:02:12.693579934 +0000 UTC m=+0.060692493 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:02:12 np0005531888 nova_compute[186788]: 2025-11-22 08:02:12.991 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:15 np0005531888 nova_compute[186788]: 2025-11-22 08:02:15.088 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:17Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:95:59 10.100.0.8
Nov 22 03:02:17 np0005531888 nova_compute[186788]: 2025-11-22 08:02:17.993 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:18 np0005531888 podman[228202]: 2025-11-22 08:02:18.684079002 +0000 UTC m=+0.061408150 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:02:18 np0005531888 podman[228203]: 2025-11-22 08:02:18.7181417 +0000 UTC m=+0.090944117 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:02:20 np0005531888 nova_compute[186788]: 2025-11-22 08:02:20.090 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:22 np0005531888 nova_compute[186788]: 2025-11-22 08:02:22.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:25 np0005531888 nova_compute[186788]: 2025-11-22 08:02:25.092 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:27 np0005531888 podman[228249]: 2025-11-22 08:02:27.680392569 +0000 UTC m=+0.055178959 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:02:27 np0005531888 podman[228250]: 2025-11-22 08:02:27.680399419 +0000 UTC m=+0.053447365 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:02:27 np0005531888 nova_compute[186788]: 2025-11-22 08:02:27.997 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:30 np0005531888 nova_compute[186788]: 2025-11-22 08:02:30.095 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531888 nova_compute[186788]: 2025-11-22 08:02:33.000 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:35 np0005531888 nova_compute[186788]: 2025-11-22 08:02:35.097 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:36 np0005531888 podman[228297]: 2025-11-22 08:02:36.695097644 +0000 UTC m=+0.066846065 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 03:02:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:36.815 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:36.815 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:36.816 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.844 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'name': 'tempest-ServerActionsTestJSON-server-1519356482', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'hostId': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.847 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'name': 'tempest-ServerActionsTestOtherB-server-767428975', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000060', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '62d9a4a13f5d41529bc273c278fae96b', 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'hostId': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.849 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '62d9a4a13f5d41529bc273c278fae96b', 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'hostId': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.849 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.853 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.856 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aaf09935-3011-4bf6-bdf9-28fe60097c1c / tap43ae6beb-d5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.856 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.858 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2cdeacd-507d-45bf-a9b4-64b9b49733c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:36.849949', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9bdef76e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '38167e7c2edf322d18b250ce22a1d02addcdb8b920a5278984752e9fcf31cc1b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 
'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:36.849949', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9bdf71d0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': '295f4ae97aafbae63f6479ee6826c7fe8e4f35d51ee09c8365173a0039485d4f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:36.849949', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9bdfc70c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': 'bf501bfb9da77b0499bd1f744b58adbabdfc4ff83ee8e8f5c5f0f0eaa0b0fcea'}]}, 'timestamp': '2025-11-22 08:02:36.859094', '_unique_id': '52b90cdba9dc480580005eef61ee5970'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.861 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.861 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>]
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.896 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.bytes volume: 290816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.898 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.936 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.write.bytes volume: 73072640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.937 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.975 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.977 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b4153a1-1345-4915-976d-38480059c249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 290816, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:36.861682', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9be5a2ee-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '0f680f2f332c4c68e0bf6f784f08e022d6211b8952c5de462fde37008c1fad9f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:36.861682', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9be5d76e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': 'b6090b9566cadd719014be52c1f55d0c49a92ab94bbe00bf3a8018f1c37c8b12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73072640, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:36.861682', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bebb576-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': '3e50958148de8e5d6ba434021765d60db0b269aeb68719b16b3c5bbc8c4ab579'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:36.861682', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bebc7fa-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': 'b401392a5d9e4896b6a2a3432f80603c1dfbde7f137f6f0784ae3ddd677df0f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 
'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:36.861682', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bf19e5a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '5d3f1c069f1e1ce5011e7eb051624e670b63e45f9625e80d21c871443d541434'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:36.861682', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bf1e766-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': 'fc6182c40a6be8c66f0a06d38e5c0f853357d8c52d38f10db5e367db54b9d1fe'}]}, 'timestamp': '2025-11-22 08:02:36.978010', '_unique_id': '24fa5776450046edbd5dfca38710ee75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.981 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.981 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.bytes.delta volume: 2766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.981 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.982 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.bytes.delta volume: 168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f2299c4-40be-4458-a001-3a3a55f823df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 2766, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:36.981415', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9bf28158-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': 'd1d843c895050add3c73bd07a6f92abcc66f33115a03067a55b063d8a1b6602b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:36.981415', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9bf28f68-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': 'e2db7321b09fe9a598af73f1b10116a098dcaee00c47078f68300744499dfad1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 168, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:36.981415', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9bf29b5c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '24fa6632afa5fc5a626d5878752d00cbd0a1611af9584fa2c2d78cc75b6226de'}]}, 'timestamp': '2025-11-22 08:02:36.982522', '_unique_id': '60755a7d96304dac9b7a481d832a2191'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:36.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:02:36 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:36.979 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.002 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.usage volume: 30408704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.002 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.017 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.018 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.030 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.030 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a706120-c0ef-495d-9466-1e294a1210b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30408704, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:36.985044', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bf5afcc-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.684603468, 'message_signature': '0f83090add1d92e8b11ae6b58d87b1f2726fb671bb777d7a3258533af11b8119'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:36.985044', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bf5be9a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.684603468, 'message_signature': 'dd6c17e9a5b35f30db1c8bbd628e063f60eb28aa0baefc7863c839f1a5de9c9a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:36.985044', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bf805a6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.70256795, 'message_signature': '974d26cd64e7a6e0ad48d1d9763d70452ed735883d357e1978101fbd972a99c6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:36.985044', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bf812ee-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.70256795, 'message_signature': '37fdbc3712b33e94fc369fb6f0960ba9b01c897794a07b0d7794559a33916e84'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': 
'62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:36.985044', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bf9f0b4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.717795615, 'message_signature': '3b5fb5ba20912a805a238b8b43c674212eb0d7e1f73aebfc15122b9b1a561b4d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:36.985044', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus'
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: age_id': '9bf9fe92-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.717795615, 'message_signature': '0eca7b0e85a039d2822997e5a30647da9a324a3b986c7968b5da3c0b07793ead'}]}, 'timestamp': '2025-11-22 08:02:37.030913', '_unique_id': 'a60ad62b9be44cedb31116637b7b11ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.033 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.packets volume: 33 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.033 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.033 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '540bae65-1a8b-4f71-8436-8441712e63b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 33, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.033111', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9bfa604e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '52f726069999427efbbdc5a983e47b5487196029360f6a1fe87a5e488b101c42'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 
'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.033111', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9bfa6b02-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': '3f30349382a18088d10f326821303e5466125fbf044114f8b98d6146c0b5bf15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.033111', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9bfa75d4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '9f15603708a4de63263ed4197aa19e431b65a443e4fb18987e316dab09aef599'}]}, 'timestamp': '2025-11-22 08:02:37.033957', '_unique_id': '1f868503376a45d6bc5403881b7d1945'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.034 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.035 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.requests volume: 1216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.035 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.036 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.read.requests volume: 1126 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.036 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.037 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.037 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f0ad82-d153-4938-8838-2c80bcd82156', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1216, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.035472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bfabc92-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '0373fc69f3d85f9b5b40f60c6eec421f565b0534cf8e37f51c5561b4a9d9765e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 
'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.035472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bfac688-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '2118da82d0fdf2a67e552c58a285dffd93f560393181184ed4cc5ff387e0867d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1126, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.035472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bfae69a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': '6b6b9e18e6628e799a65699f16745d90d578c5e8425f86ab051506cb7ea755c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.035472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bfaf216-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': '9d2a66fc73fcefb3632e24dc3eb24b7d4ae669f676af14a34d445f931f41bcaf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.035472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bfafd74-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': 'b0215b7418147a2bbf56c2f7e4421b39f3ec7f3173d6a9bf8adb8da4d5c366f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.035472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 
'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bfb09d6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '713b33f3ba98e5d0ee8b5be7c2aa259a62f03e6fb99b99c40a1d7a1cf5e4a720'}]}, 'timestamp': '2025-11-22 08:02:37.037754', '_unique_id': '6a188327a2484126acdb9891095fda94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.039 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.039 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>]
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.039 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.bytes volume: 3796 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.040 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.040 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81ecc144-01ea-4440-8acd-39209a8ed708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3796, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.039755', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9bfb62c8-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '7d293c6fc4f28b65fc53690905db2d951d665a27449c33566c7f60ef2002c973'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.039755', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9bfb6caa-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': 'bea32d0efcd0141805dea52ae6eb355654a8784be815eb374479da33e49aba09'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.039755', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9bfb78c6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '736bdadfcbd183cbfd85bf5db2ef76d7dcedfdf08705b5de59a1deecfa588a03'}]}, 'timestamp': '2025-11-22 08:02:37.040581', '_unique_id': '2f747fa27760444c935cd3080b124752'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.041 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.058 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/cpu volume: 14340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.075 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/cpu volume: 15090000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.090 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/cpu volume: 13790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4303aee2-23f6-4009-8eac-887629ec6bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14340000000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'timestamp': '2025-11-22T08:02:37.042003', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9bfe5884-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.758340212, 'message_signature': '725e49440cc7d4c934c115b43ac6d014f17eb7ced9e1912899a09365672b71db'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15090000000, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 
'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'timestamp': '2025-11-22T08:02:37.042003', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9c00eb6c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.775127094, 'message_signature': 'cb5b5d6a7087f8bf51d8d5bac01ab5cf5b8410d719379700dee1c1e6a9dcf5ad'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13790000000, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'timestamp': '2025-11-22T08:02:37.042003', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 
'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9c0337a0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.790156014, 'message_signature': 'df7915de20c52dfffa8aee3dbb682f61221e8c12bced41f6cd6bb4f477671748'}]}, 'timestamp': '2025-11-22 08:02:37.091452', '_unique_id': '29018f5c29fb418798482f5558962b75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.092 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.093 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.093 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>]
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.093 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.093 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/memory.usage volume: 42.3359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.094 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/memory.usage volume: 42.75 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.094 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/memory.usage volume: 42.62109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddba59af-ceaa-462b-bb49-1df642e6a3be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.3359375, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'timestamp': '2025-11-22T08:02:37.093781', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9c03a24e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.758340212, 'message_signature': '6ac04f6bde9047107e5853777a980139ed0b06ea9ec73d38ff6be88715bb4006'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.75, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 
'timestamp': '2025-11-22T08:02:37.093781', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9c03ab18-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.775127094, 'message_signature': '7010cc8428f9b4458e738f135b280914f5c4008859615ed6ea89a739c34bcec4'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.62109375, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'timestamp': '2025-11-22T08:02:37.093781', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9c03b428-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.790156014, 'message_signature': 'e9a5bf3bd9e01854bef7afe76fdd230f825d4ae39eb5066eeb4279bdd7e22010'}]}, 'timestamp': '2025-11-22 08:02:37.094502', '_unique_id': '0c76bf3ddbab4dd8a1a89fb05bae2a1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.095 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.096 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.096 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd35a424d-e166-45a0-bb8a-91bfec319080', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.095818', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9c03f05a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '95505b8bd93960d523b53b90e959080d99740217d6821a9f97f5557d12ea3c6c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.095818', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9c03fadc-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': '7d14273033098805c39ad642b66ffb4c35babee407c27265a45ff26a13bd8122'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.095818', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9c04046e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '4afb6deb23a681e9693b17d88b86f49e29eebaf43218aab7ee658bfb60cf2909'}]}, 'timestamp': '2025-11-22 08:02:37.096576', '_unique_id': '65715305975e44a5b31536007c35b73a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.097 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.098 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.098 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce7adb2d-c79e-4704-ae92-4fea8d3922da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.097825', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9c043f60-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '199804f7154a84b5e20c7d3ae2d2e3f881f89ceeb3aca168e6a3b4dd5c9e2f13'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.097825', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9c044a5a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': 'e76545777b5b3d634e881a24e9aaf2b2421fd67bbe665e18304c5aba8e27100c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.097825', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9c045504-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': 'a4c62ceed56c371d7618b8255b36587ef08d5c32c48f81e5788b94b5ca979545'}]}, 'timestamp': '2025-11-22 08:02:37.098681', '_unique_id': '1238586497134fa89544ad6cc3b3e8e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.099 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.100 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-767428975>]
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.100 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.bytes volume: 4488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.100 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.incoming.bytes volume: 4041 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.100 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.bytes volume: 1598 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b401f9f2-6dda-4b9b-9f81-d411fc90a51f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4488, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.100306', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9c04a1bc-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '931b6d95ba692e9ae9bfc7b754d9853a398feb7fc379bf33b05c4feaf5d9ce05'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4041, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.100306', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9c04ac34-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': '4caa152e5ded01cec0892a50b72ba2836faf932980c3b76b01ce7e54a118991f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1598, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.100306', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9c04b49a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': 'fe4ce79e8641354edab55284ebe62bf38a4d49cde147849589c359fcbd4949bf'}]}, 'timestamp': '2025-11-22 08:02:37.101071', '_unique_id': '42b59dd6a9ab45479fb2cbf4062ccec5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.101 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.102 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.102 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.103 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.write.requests volume: 346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.103 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.103 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.requests volume: 295 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.103 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '040e372c-185d-4460-9c08-a9eabcead0fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.102431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c04f432-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '09b06965b78050b0e1420ef1b2016436c29a0d5ef41bc18a0c928ed623588af4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': 
None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.102431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c04ff2c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '14ed1c06e3bb31f60786e71e6bdaa1c4c5b65cc9006297a17f99f457f73bbf5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 346, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.102431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c0509a4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': 'e25b1a9dddeb60aa47b4f299964b42d1830a470282bb8f00b1cc13118fa20d43'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.102431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c051444-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': '48a77a20156f81588f3911f7618710fec5cb45a0b9ecc4f770eb8b38c3808005'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 295, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 
'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.102431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c051f20-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '383d014e6cd373e571413bc60c138b97c45c1e4e7d4e9e1fadce48e3b55cd0d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.102431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c052a74-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': 'e453797a85107eaadb134f1f1df9594f2fb4485d05a51a3c0e27cb4a9059eea7'}]}, 'timestamp': '2025-11-22 08:02:37.104135', '_unique_id': 'e27a9f5942c0440ca9a036fe7bd2dd93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.106 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.106 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.106 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4288cfa5-1ba4-4a40-a0cc-90d3a5d05210', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.105977', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9c057eca-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '9de1af95ee10c013d677c2269d8c55a9b5c7018da9a535ccb84116679c7ea335'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.105977', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9c059388-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': '099426aa4a060c3a3a3e8ced93b43a8785190aa792dd84c3c978847b29f4fe38'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.105977', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9c059cf2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '2032e9a61f5a5d7264ea550f20ac303e9aa9a06d3dab2c780f5e0113dcd1ac97'}]}, 'timestamp': '2025-11-22 08:02:37.107020', '_unique_id': 'ad70ab649a7c438cb3abfbc766920915'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.107 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.108 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.latency volume: 836972033 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.108 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.108 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.write.latency volume: 242471570621 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.109 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.109 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.latency volume: 73053229968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.109 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c494eafe-0074-4804-962d-fc8d086002f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 836972033, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.108425', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c05dcc6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '3bc58ffac2d1a5c2d595e6343e262f69d0ab1dd357cc8b6243302c0bfa8bcf95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.108425', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c05e608-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '6da282113562630b82bf199e02e61e8a03737cedcc45e546848e8c8875596c81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 242471570621, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.108425', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c05ee5a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': 'd31359ce99ec4e7c2d8c7c7f76f92ccb067b6785f75a71a6c0d3010c6c1c1974'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.108425', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c05f5da-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': 'a75e48333fff300d34fbb80a4e3e4648e72cf3f8cab9bfda4ba9a0fe77a209d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 73053229968, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 
'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.108425', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c05ff58-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '795d7eaa08b48e2edbfb771565d01a8a274c0d0d420f9951d10e168d7322ee8e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.108425', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d3
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c060b06-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': 'ac0c045b1fb0777df81a7340abe8b23aa1f8f9973c610ff5dac0aa4f22233758'}]}, 'timestamp': '2025-11-22 08:02:37.109840', '_unique_id': '60475321ce3c4ca4809e104a39b3a4c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.111 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.112 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.112 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.112 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.113 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.113 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c96d0e02-006e-4554-8944-6ecc96750280', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.111901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c0666c8-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.684603468, 'message_signature': 'd11b69b1f9e4de10c01630cb9d373ebdd826014232dddf19e43af0c0474c2fcd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.111901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c0672b2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.684603468, 'message_signature': '2286f4b95c93ec0b2da19a68a46e6c4fdffba0efca7a3f0ed90ea81d437cb629'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.111901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c067f96-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.70256795, 'message_signature': '50110e921b3c0863e2049455a0ff0580a5e14cb4c6c4c54400b1765b194ff3eb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.111901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c068ae0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.70256795, 'message_signature': '77ee53405564e763c834a42e084ec132b148b9a67a0f8864d5c7b7cec5fb6f43'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 
'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.111901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c0694d6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.717795615, 'message_signature': '01e0831dbf9397f47f96be43bddf9012536b1a533f42fbfe5beb711f0d0a743d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.111901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', '
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: isk_name': 'sda'}, 'message_id': '9c069c38-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.717795615, 'message_signature': 'f78563eba38d20ec7068219fc0a6e6c2bc649e8db6d027863b7d7dfcf8295632'}]}, 'timestamp': '2025-11-22 08:02:37.113545', '_unique_id': '1a13747567d04d5dafaf160f9ecf11c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.115 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.bytes volume: 32081920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.115 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.115 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.read.bytes volume: 30468608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.115 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.116 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.bytes volume: 30747136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.116 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0e9d566-3539-4486-9639-baa72006ad8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32081920, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.115026', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c06dee6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '5d72783f49766cc0628b37cc9025105b258ccf5a0f48fb78a7e9d94d4879d161'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.115026', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c06eb3e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': 'bfe73a9398a8a9b2da052d2fa8fd186d2e79d4abb63b943dc7332eb45675f063'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30468608, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.115026', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c06f8ae-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': '2a25e3349edea26632d92325a8e5ee7caf40bc05c3d79fe0b1e1bb6ba17467b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.115026', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c0703b2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': 'b319a342c54d1cbc96a9bd18b58322ab8291abee9f6983731e789a8cc20f88e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30747136, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': 
None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.115026', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c070fba-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '52d9f838b45f072519801d7dd1f41a7218bc8eb58555bd093660cce286ba0009'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.115026', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_u
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c071c1c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '14d535f672a9f50b07f2a8a7796bce5e4307dac220d5aa960034220ca495630d'}]}, 'timestamp': '2025-11-22 08:02:37.116890', '_unique_id': '7f82a022c75548d587d120c9ca46be6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.118 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.allocation volume: 31399936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.119 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.119 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.119 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.120 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.120 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca07ffa7-4983-4c4e-b467-9e3741489d64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31399936, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.118864', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c07763a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.684603468, 'message_signature': 'a0731cc31ba4e192a238ed532b685d9d65bf0ed01f0095972b4717e6c4c75c98'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.118864', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c078170-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.684603468, 'message_signature': 'ae75da2ded17df329222fca2a1eef68ac29b9dcdc66248a5da70ee66b142fc1b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.118864', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c078d82-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.70256795, 'message_signature': '76b85e1418854763ac02608595b731ee4517ba3e152aa0f6d1d8e4c5f3c1d441'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.118864', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c0798c2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.70256795, 'message_signature': '9f2ddb13cb0942be66d284a839cd6e998278d3d070193ebfd29a7fd46241db37'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 
'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.118864', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c07a448-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.717795615, 'message_signature': 'ecb6403af728174127e5fc4898395133526a8b4235169f025c5d098b4f5c4a91'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.118864', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 1, 'disk_name': 'sda'}, 'message_id': '9c07b05a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.717795615, 'message_signature': '74c48906d30d438311a9a927e9d780b196ff7996ffaad40124a2a83347fb94d6'}]}, 'timestamp': '2025-11-22 08:02:37.120687', '_unique_id': 'b19da614de0d4abca3028738e7d66af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.122 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.bytes.delta volume: 2470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.122 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.122 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.bytes.delta volume: 294 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15ffdddd-ad7d-4f5b-8110-4506f46efc57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 2470, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.122336', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9c07fdf8-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': 'b4786b859464bb7baecf075772c0927f44346fe455d7703b35f7d6d34cef910b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.122336', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9c080802-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': 'f34df0b3d0aae4a2239cae36693391909a77e3be938939787d461d0e300a6a5e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 294, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.122336', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9c08111c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '7c051e32c1e3ba2d5a6bb169d99f9c45e95d69f7045c07a777bf64f413d7e5f5'}]}, 'timestamp': '2025-11-22 08:02:37.123120', '_unique_id': '0335dc16a2434b72947121630f2bfe14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.123 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.125 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.latency volume: 2966432114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.125 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.device.read.latency volume: 235070298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.126 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.read.latency volume: 3374575324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.126 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.device.read.latency volume: 134565536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.126 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.latency volume: 1310824649 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.126 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/disk.device.read.latency volume: 113214800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9a16030-bdbb-4193-8314-fa47b1ea01ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2966432114, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-vda', 'timestamp': '2025-11-22T08:02:37.125550', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c087eae-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '1ca5ea4220dda27238bfa609e6c53e25984fa7c464245e8c201611445ea164d2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 235070298, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 
'project_name': None, 'resource_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-sda', 'timestamp': '2025-11-22T08:02:37.125550', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'instance-0000005d', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c088cb4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.561188954, 'message_signature': '74132c7da8658adb0f602fcf52a66b5419912eb3ec73abb09e67cd864b197439'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3374575324, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-vda', 'timestamp': '2025-11-22T08:02:37.125550', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c089538-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': 'd62e703bd92560777d805c5160f1a4a866d1d37be513bbdab94837603c309e21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 134565536, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c-sda', 'timestamp': '2025-11-22T08:02:37.125550', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'instance-00000060', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c089eb6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.598369959, 'message_signature': '2101fa2cb5358173f2143ce7ead21fd00f210c8257254878886ebff5f8bb798c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1310824649, 'user_id': 
'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-vda', 'timestamp': '2025-11-22T08:02:37.125550', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9c08a83e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '1fa6c96f3ad78c41211ec059f1d421f5648c04307b52c480ddfc92e10efbf33f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 113214800, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': '80cb8b15-443c-424b-894c-1ed6674f77d5-sda', 'timestamp': '2025-11-22T08:02:37.125550', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'instance-0000005c', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 
'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9c08b022-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.637326227, 'message_signature': '8d25f3cb0a12237286f01b712ff55ba64ed58e6be63de8cb19b9acd9b0a58a87'}]}, 'timestamp': '2025-11-22 08:02:37.127196', '_unique_id': '1097d2e0a1a24b5ba858dac24d45a44d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.129 12 DEBUG ceilometer.compute.pollsters [-] eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.129 12 DEBUG ceilometer.compute.pollsters [-] aaf09935-3011-4bf6-bdf9-28fe60097c1c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.129 12 DEBUG ceilometer.compute.pollsters [-] 80cb8b15-443c-424b-894c-1ed6674f77d5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93cb3353-41cb-4e60-b5d7-f51dde21930f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000005d-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-tapa2f45e58-23', 'timestamp': '2025-11-22T08:02:37.129204', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1519356482', 'name': 'tapa2f45e58-23', 'instance_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:df:95:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2f45e58-23'}, 'message_id': '9c0908c4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.549444946, 'message_signature': '8891954c55bc93e098dc22d46e07b87fc58a215340100df914db55e94003a8e4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-00000060-aaf09935-3011-4bf6-bdf9-28fe60097c1c-tap43ae6beb-d5', 'timestamp': '2025-11-22T08:02:37.129204', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-767428975', 'name': 'tap43ae6beb-d5', 'instance_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'instance_type': 'm1.nano', 'host': 'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:57:82:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43ae6beb-d5'}, 'message_id': '9c091116-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.553307131, 'message_signature': '357d334f9becbf7939945d4ba2981599641cd08d5d0b547eb21e134bf416d9eb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_name': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_name': None, 'resource_id': 'instance-0000005c-80cb8b15-443c-424b-894c-1ed6674f77d5-tap487183e6-b0', 'timestamp': '2025-11-22T08:02:37.129204', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1246188074', 'name': 'tap487183e6-b0', 'instance_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'instance_type': 'm1.nano', 'host': 
'f554bcb6b73f8321c4e5c3fd3666f13662fceffd58fc81542a71ae20', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2d:0b:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap487183e6-b0'}, 'message_id': '9c091ac6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5297.556426257, 'message_signature': '8f8b2c4c5dacf4ee9cfa6648be4f6a839d435e9a31bef549fa7d542e6afbcc8d'}]}, 'timestamp': '2025-11-22 08:02:37.129940', '_unique_id': 'a93702138220481e994a09fd3580f9b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:02:37.130 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.032 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.038 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.104 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.110 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.114 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.117 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.121 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 rsyslogd[1010]: message too long (8192) with configured size 8096, begin of message is: 2025-11-22 08:02:37.128 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 22 03:02:37 np0005531888 nova_compute[186788]: 2025-11-22 08:02:37.876 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:37 np0005531888 nova_compute[186788]: 2025-11-22 08:02:37.876 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:37 np0005531888 nova_compute[186788]: 2025-11-22 08:02:37.877 186792 INFO nova.compute.manager [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Shelving#033[00m
Nov 22 03:02:37 np0005531888 nova_compute[186788]: 2025-11-22 08:02:37.905 186792 DEBUG nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:02:38 np0005531888 nova_compute[186788]: 2025-11-22 08:02:38.002 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531888 podman[228318]: 2025-11-22 08:02:39.68698006 +0000 UTC m=+0.056693555 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.100 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531888 kernel: tap43ae6beb-d5 (unregistering): left promiscuous mode
Nov 22 03:02:40 np0005531888 NetworkManager[55166]: <info>  [1763798560.3064] device (tap43ae6beb-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.310 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:40Z|00277|binding|INFO|Releasing lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 from this chassis (sb_readonly=0)
Nov 22 03:02:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:40Z|00278|binding|INFO|Setting lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 down in Southbound
Nov 22 03:02:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:02:40Z|00279|binding|INFO|Removing iface tap43ae6beb-d5 ovn-installed in OVS
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.314 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.333 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531888 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000060.scope: Deactivated successfully.
Nov 22 03:02:40 np0005531888 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000060.scope: Consumed 18.976s CPU time.
Nov 22 03:02:40 np0005531888 systemd-machined[153106]: Machine qemu-43-instance-00000060 terminated.
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.428 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:82:86 10.100.0.11'], port_security=['fa:16:3e:57:82:86 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=43ae6beb-d59a-483d-8ced-1303f84a69d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.429 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 43ae6beb-d59a-483d-8ced-1303f84a69d1 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 unbound from our chassis#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.432 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.448 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c9665935-9a19-4b59-9fa4-a6b405d0029b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.482 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd9808d-d2d5-4656-9d8d-92b09c30bb57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.485 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[50bb8e56-1a0d-44f2-86d4-de42b4660398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.515 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5a28de3f-cb82-4f0f-bbb0-8d7f8c1dea51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.533 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[763c372d-22bb-414b-89d5-adaabc17c281]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228354, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.549 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[533a3e8f-7d4e-4efa-a0a9-58f54b87b081]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506813, 'tstamp': 506813}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228359, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506816, 'tstamp': 506816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228359, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.552 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.553 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.558 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.559 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.559 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.560 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:02:40.560 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.907 186792 DEBUG nova.compute.manager [req-e77a70fb-7182-4acc-88d1-b22e5fbbc6cc req-03497569-2a0e-41ae-92ca-c36f99506eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-unplugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.907 186792 DEBUG oslo_concurrency.lockutils [req-e77a70fb-7182-4acc-88d1-b22e5fbbc6cc req-03497569-2a0e-41ae-92ca-c36f99506eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.907 186792 DEBUG oslo_concurrency.lockutils [req-e77a70fb-7182-4acc-88d1-b22e5fbbc6cc req-03497569-2a0e-41ae-92ca-c36f99506eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.907 186792 DEBUG oslo_concurrency.lockutils [req-e77a70fb-7182-4acc-88d1-b22e5fbbc6cc req-03497569-2a0e-41ae-92ca-c36f99506eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.908 186792 DEBUG nova.compute.manager [req-e77a70fb-7182-4acc-88d1-b22e5fbbc6cc req-03497569-2a0e-41ae-92ca-c36f99506eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] No waiting events found dispatching network-vif-unplugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.908 186792 WARNING nova.compute.manager [req-e77a70fb-7182-4acc-88d1-b22e5fbbc6cc req-03497569-2a0e-41ae-92ca-c36f99506eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received unexpected event network-vif-unplugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 for instance with vm_state active and task_state shelving.#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.922 186792 INFO nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.928 186792 INFO nova.virt.libvirt.driver [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance destroyed successfully.#033[00m
Nov 22 03:02:40 np0005531888 nova_compute[186788]: 2025-11-22 08:02:40.929 186792 DEBUG nova.objects.instance [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'numa_topology' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:41 np0005531888 nova_compute[186788]: 2025-11-22 08:02:41.213 186792 INFO nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Beginning cold snapshot process#033[00m
Nov 22 03:02:41 np0005531888 nova_compute[186788]: 2025-11-22 08:02:41.455 186792 DEBUG nova.privsep.utils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 03:02:41 np0005531888 nova_compute[186788]: 2025-11-22 08:02:41.456 186792 DEBUG oslo_concurrency.processutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk /var/lib/nova/instances/snapshots/tmpc5p300qg/b48f419209054b4b8127e9145d84920b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.005 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.254 186792 DEBUG nova.compute.manager [req-2058a5be-1172-4389-9aec-0e280030be6b req-14c5fae0-6889-40ab-a8bc-271ca9f6d0cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.254 186792 DEBUG oslo_concurrency.lockutils [req-2058a5be-1172-4389-9aec-0e280030be6b req-14c5fae0-6889-40ab-a8bc-271ca9f6d0cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.254 186792 DEBUG oslo_concurrency.lockutils [req-2058a5be-1172-4389-9aec-0e280030be6b req-14c5fae0-6889-40ab-a8bc-271ca9f6d0cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.255 186792 DEBUG oslo_concurrency.lockutils [req-2058a5be-1172-4389-9aec-0e280030be6b req-14c5fae0-6889-40ab-a8bc-271ca9f6d0cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.255 186792 DEBUG nova.compute.manager [req-2058a5be-1172-4389-9aec-0e280030be6b req-14c5fae0-6889-40ab-a8bc-271ca9f6d0cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] No waiting events found dispatching network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:43 np0005531888 nova_compute[186788]: 2025-11-22 08:02:43.255 186792 WARNING nova.compute.manager [req-2058a5be-1172-4389-9aec-0e280030be6b req-14c5fae0-6889-40ab-a8bc-271ca9f6d0cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received unexpected event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 for instance with vm_state active and task_state shelving_image_pending_upload.#033[00m
Nov 22 03:02:43 np0005531888 podman[228383]: 2025-11-22 08:02:43.687700531 +0000 UTC m=+0.059497164 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., 
distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 22 03:02:45 np0005531888 nova_compute[186788]: 2025-11-22 08:02:45.102 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:48 np0005531888 nova_compute[186788]: 2025-11-22 08:02:48.007 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:48 np0005531888 nova_compute[186788]: 2025-11-22 08:02:48.362 186792 DEBUG oslo_concurrency.processutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk /var/lib/nova/instances/snapshots/tmpc5p300qg/b48f419209054b4b8127e9145d84920b" returned: 0 in 6.906s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:48 np0005531888 nova_compute[186788]: 2025-11-22 08:02:48.363 186792 INFO nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Snapshot extracted, beginning image upload#033[00m
Nov 22 03:02:49 np0005531888 podman[228406]: 2025-11-22 08:02:49.705478669 +0000 UTC m=+0.071833506 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:02:49 np0005531888 podman[228407]: 2025-11-22 08:02:49.728901126 +0000 UTC m=+0.091606083 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.853 186792 INFO nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Snapshot image upload complete#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.857 186792 DEBUG nova.compute.manager [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.938 186792 INFO nova.compute.manager [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Shelve offloading#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.953 186792 INFO nova.virt.libvirt.driver [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance destroyed successfully.#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.954 186792 DEBUG nova.compute.manager [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.956 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.957 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:50 np0005531888 nova_compute[186788]: 2025-11-22 08:02:50.957 186792 DEBUG nova.network.neutron [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:02:51 np0005531888 nova_compute[186788]: 2025-11-22 08:02:51.968 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:52 np0005531888 nova_compute[186788]: 2025-11-22 08:02:52.346 186792 DEBUG nova.network.neutron [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:52 np0005531888 nova_compute[186788]: 2025-11-22 08:02:52.369 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:52 np0005531888 nova_compute[186788]: 2025-11-22 08:02:52.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:53 np0005531888 nova_compute[186788]: 2025-11-22 08:02:53.010 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:53 np0005531888 nova_compute[186788]: 2025-11-22 08:02:53.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:53 np0005531888 nova_compute[186788]: 2025-11-22 08:02:53.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:02:53 np0005531888 nova_compute[186788]: 2025-11-22 08:02:53.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.165 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.166 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.166 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.166 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 80cb8b15-443c-424b-894c-1ed6674f77d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.173 186792 INFO nova.virt.libvirt.driver [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance destroyed successfully.#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.174 186792 DEBUG nova.objects.instance [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.198 186792 DEBUG nova.virt.libvirt.vif [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-767428975',display_name='tempest-ServerActionsTestOtherB-server-767428975',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-767428975',id=96,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-41j79svb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member',shelved_at='2025-11-22T08:02:50.857407',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='243c9ed1-3260-4815-a04c-08e51aa5a027'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:02:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=aaf09935-3011-4bf6-bdf9-28fe60097c1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": 
"f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.199 186792 DEBUG nova.network.os_vif_util [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.200 186792 DEBUG nova.network.os_vif_util [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.200 186792 DEBUG os_vif [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.202 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43ae6beb-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.205 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.209 186792 INFO os_vif [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5')#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.210 186792 INFO nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Deleting instance files /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c_del#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.217 186792 INFO nova.virt.libvirt.driver [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Deletion of /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c_del complete#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.437 186792 INFO nova.scheduler.client.report [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Deleted allocations for instance aaf09935-3011-4bf6-bdf9-28fe60097c1c#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.507 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.508 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.606 186792 DEBUG nova.compute.provider_tree [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.624 186792 DEBUG nova.scheduler.client.report [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.663 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:54 np0005531888 nova_compute[186788]: 2025-11-22 08:02:54.740 186792 DEBUG oslo_concurrency.lockutils [None req-b3a46644-0d79-4b5c-9c6d-587f1ecb90e9 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 16.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:55 np0005531888 nova_compute[186788]: 2025-11-22 08:02:55.582 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798560.5811598, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:02:55 np0005531888 nova_compute[186788]: 2025-11-22 08:02:55.583 186792 INFO nova.compute.manager [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:02:55 np0005531888 nova_compute[186788]: 2025-11-22 08:02:55.599 186792 DEBUG nova.compute.manager [None req-d2dfea39-1fa0-4fad-aa6d-d40966828a24 - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.062 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updating instance_info_cache with network_info: [{"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.102 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-80cb8b15-443c-424b-894c-1ed6674f77d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.103 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.103 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.104 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.104 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531888 nova_compute[186788]: 2025-11-22 08:02:56.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:02:58 np0005531888 nova_compute[186788]: 2025-11-22 08:02:58.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:58 np0005531888 podman[228454]: 2025-11-22 08:02:58.684779047 +0000 UTC m=+0.052866750 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:02:58 np0005531888 podman[228453]: 2025-11-22 08:02:58.685013693 +0000 UTC m=+0.054630874 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.405 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.406 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.406 186792 INFO nova.compute.manager [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Unshelving#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.552 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.553 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.559 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_requests' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.575 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'numa_topology' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.590 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.591 186792 INFO nova.compute.claims [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.799 186792 DEBUG nova.compute.provider_tree [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.813 186792 DEBUG nova.scheduler.client.report [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:02:59 np0005531888 nova_compute[186788]: 2025-11-22 08:02:59.862 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:00 np0005531888 nova_compute[186788]: 2025-11-22 08:03:00.061 186792 INFO nova.network.neutron [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating port 43ae6beb-d59a-483d-8ced-1303f84a69d1 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 22 03:03:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:01.014 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:01.017 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:01.019 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.634 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.634 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.634 186792 DEBUG nova.network.neutron [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.760 186792 DEBUG nova.compute.manager [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-changed-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.760 186792 DEBUG nova.compute.manager [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Refreshing instance network info cache due to event network-changed-43ae6beb-d59a-483d-8ced-1303f84a69d1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.760 186792 DEBUG oslo_concurrency.lockutils [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:01 np0005531888 nova_compute[186788]: 2025-11-22 08:03:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:02 np0005531888 nova_compute[186788]: 2025-11-22 08:03:02.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:02 np0005531888 nova_compute[186788]: 2025-11-22 08:03:02.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:02 np0005531888 nova_compute[186788]: 2025-11-22 08:03:02.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:02 np0005531888 nova_compute[186788]: 2025-11-22 08:03:02.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:02 np0005531888 nova_compute[186788]: 2025-11-22 08:03:02.987 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.014 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.082 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.144 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.145 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.186 186792 DEBUG nova.network.neutron [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.207 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.209 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.211 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.212 186792 INFO nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Creating image(s)#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.212 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.213 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.214 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.214 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'trusted_certs' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.216 186792 DEBUG oslo_concurrency.lockutils [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.216 186792 DEBUG nova.network.neutron [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Refreshing network info cache for port 43ae6beb-d59a-483d-8ced-1303f84a69d1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.224 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.245 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "0ee57c243eb3946217335143ef545d69665f34f2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.247 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "0ee57c243eb3946217335143ef545d69665f34f2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.291 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.292 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.353 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.534 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.535 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5391MB free_disk=73.28369140625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.536 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.536 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.610 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 80cb8b15-443c-424b-894c-1ed6674f77d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.611 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.611 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance aaf09935-3011-4bf6-bdf9-28fe60097c1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.611 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.611 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.748 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.763 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.792 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:03:03 np0005531888 nova_compute[186788]: 2025-11-22 08:03:03.793 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:04 np0005531888 nova_compute[186788]: 2025-11-22 08:03:04.213 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.303 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.360 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.part --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.362 186792 DEBUG nova.virt.images [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] 243c9ed1-3260-4815-a04c-08e51aa5a027 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.363 186792 DEBUG nova.privsep.utils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.363 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.part /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.726 186792 DEBUG nova.network.neutron [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updated VIF entry in instance network info cache for port 43ae6beb-d59a-483d-8ced-1303f84a69d1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.728 186792 DEBUG nova.network.neutron [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:05 np0005531888 nova_compute[186788]: 2025-11-22 08:03:05.749 186792 DEBUG oslo_concurrency.lockutils [req-bbcd46ca-1d41-4fef-80b0-f9987bb42ffb req-f9483833-c4a8-4e01-9c28-b6c0be2e4bdb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-aaf09935-3011-4bf6-bdf9-28fe60097c1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.011 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.part /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.converted" returned: 0 in 1.648s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.021 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.092 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.093 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "0ee57c243eb3946217335143ef545d69665f34f2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.108 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.165 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.167 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "0ee57c243eb3946217335143ef545d69665f34f2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.168 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "0ee57c243eb3946217335143ef545d69665f34f2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.179 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.244 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.245 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2,backing_fmt=raw /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.353 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2,backing_fmt=raw /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk 1073741824" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.354 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "0ee57c243eb3946217335143ef545d69665f34f2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.355 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.412 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.413 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.426 186792 INFO nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Rebasing disk image.#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.427 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.491 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:07 np0005531888 nova_compute[186788]: 2025-11-22 08:03:07.492 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:07 np0005531888 podman[228541]: 2025-11-22 08:03:07.687793799 +0000 UTC m=+0.057614978 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.942 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk" returned: 0 in 1.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.943 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.943 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Ensure instance console log exists: /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.943 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.944 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.944 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.946 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Start _get_guest_xml network_info=[{"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='fe3cf6b41ec0673ab47546044ea3f047',container_format='bare',created_at=2025-11-22T08:02:37Z,direct_url=<?>,disk_format='qcow2',id=243c9ed1-3260-4815-a04c-08e51aa5a027,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-767428975-shelved',owner='62d9a4a13f5d41529bc273c278fae96b',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-22T08:02:50Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.950 186792 WARNING nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.957 186792 DEBUG nova.virt.libvirt.host [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.958 186792 DEBUG nova.virt.libvirt.host [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.961 186792 DEBUG nova.virt.libvirt.host [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.961 186792 DEBUG nova.virt.libvirt.host [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.963 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.963 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='fe3cf6b41ec0673ab47546044ea3f047',container_format='bare',created_at=2025-11-22T08:02:37Z,direct_url=<?>,disk_format='qcow2',id=243c9ed1-3260-4815-a04c-08e51aa5a027,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-767428975-shelved',owner='62d9a4a13f5d41529bc273c278fae96b',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-22T08:02:50Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.964 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.964 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.964 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.964 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.965 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.966 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.967 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.967 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.967 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.967 186792 DEBUG nova.virt.hardware [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
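The "Build topologies for 1 vcpu(s) 1:1:1" / "Got 1 possible topologies" sequence above enumerates every (sockets, cores, threads) split of the flavor's vCPU count within the 65536 limits. A minimal illustrative sketch of that enumeration (not Nova's actual `_get_possible_cpu_topologies` implementation; function name and defaults are assumptions for this example):

```python
# Illustrative sketch of enumerating candidate CPU topologies for a vCPU
# count, mirroring the "Build topologies" step in the log above. Not the
# real nova.virt.hardware code; the limits default to the 65536 maximums
# shown in the "Chose sockets=0, cores=0, threads=0" line.

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Return (sockets, cores, threads) triples whose product equals vcpus."""
    topologies = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                topologies.append((sockets, cores, threads))
    return topologies
```

For the 1-vCPU `m1.nano` flavor in this trace the only factorization is `(1, 1, 1)`, matching the single `VirtCPUTopology(cores=1,sockets=1,threads=1)` the log reports.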
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.968 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'vcpu_model' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.983 186792 DEBUG nova.virt.libvirt.vif [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-767428975',display_name='tempest-ServerActionsTestOtherB-server-767428975',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-767428975',id=96,image_ref='243c9ed1-3260-4815-a04c-08e51aa5a027',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-41j79svb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member',shelved_at='2025-11-22T08:02:50.857407',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='243c9ed1-3260-4815-a04c-08e51aa5a027'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=aaf09935-3011-4bf6-bdf9-28fe60097c1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.983 186792 DEBUG nova.network.os_vif_util [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.984 186792 DEBUG nova.network.os_vif_util [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.985 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:08 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.997 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <uuid>aaf09935-3011-4bf6-bdf9-28fe60097c1c</uuid>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <name>instance-00000060</name>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestOtherB-server-767428975</nova:name>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:03:08</nova:creationTime>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="243c9ed1-3260-4815-a04c-08e51aa5a027"/>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        <nova:port uuid="43ae6beb-d59a-483d-8ced-1303f84a69d1">
Nov 22 03:03:08 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <entry name="serial">aaf09935-3011-4bf6-bdf9-28fe60097c1c</entry>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <entry name="uuid">aaf09935-3011-4bf6-bdf9-28fe60097c1c</entry>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:03:08 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:57:82:86"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <target dev="tap43ae6beb-d5"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/console.log" append="off"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:03:09 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:03:09 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:03:09 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:03:09 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
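The `_get_guest_xml` dump above is the complete libvirt domain definition Nova will pass to libvirt. When debugging, the top-level fields can be pulled out of such a dump with the standard library; a minimal sketch (only the element names actually present in the logged XML are used, and the snippet below is a trimmed excerpt of it):

```python
# Minimal sketch: extract the identity fields from a libvirt domain XML
# dump like the one logged above. Uses only the standard library; the XML
# string is a trimmed excerpt of the logged document.
import xml.etree.ElementTree as ET

domain_xml = """
<domain type="kvm">
  <uuid>aaf09935-3011-4bf6-bdf9-28fe60097c1c</uuid>
  <name>instance-00000060</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
</domain>
"""

root = ET.fromstring(domain_xml)
summary = {
    "type": root.get("type"),                       # virtualization driver, "kvm" here
    "uuid": root.findtext("uuid"),
    "name": root.findtext("name"),
    "memory_kib": int(root.findtext("memory")),     # libvirt's default memory unit is KiB
    "vcpus": int(root.findtext("vcpu")),
}
```

Note the `<memory>131072</memory>` value: libvirt counts KiB, so it matches the flavor's `memory_mb=128` (128 MiB x 1024).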
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.999 186792 DEBUG nova.compute.manager [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Preparing to wait for external event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:08.999 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.000 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.000 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.001 186792 DEBUG nova.virt.libvirt.vif [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-767428975',display_name='tempest-ServerActionsTestOtherB-server-767428975',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-767428975',id=96,image_ref='243c9ed1-3260-4815-a04c-08e51aa5a027',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-41j79svb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member',shelved_at='2025-11-22T08:02:50.857407',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='243c9ed1-3260-4815-a04c-08e51aa5a027'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=aaf09935-3011-4bf6-bdf9-28fe60097c1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.001 186792 DEBUG nova.network.os_vif_util [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.002 186792 DEBUG nova.network.os_vif_util [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.002 186792 DEBUG os_vif [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.003 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.003 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.004 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.007 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43ae6beb-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.007 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43ae6beb-d5, col_values=(('external_ids', {'iface-id': '43ae6beb-d59a-483d-8ced-1303f84a69d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:82:86', 'vm-uuid': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.009 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:09 np0005531888 NetworkManager[55166]: <info>  [1763798589.0102] manager: (tap43ae6beb-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.016 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.017 186792 INFO os_vif [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5')#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.077 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.077 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.078 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No VIF found with MAC fa:16:3e:57:82:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.079 186792 INFO nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Using config drive#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.093 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'ec2_ids' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:09 np0005531888 nova_compute[186788]: 2025-11-22 08:03:09.239 186792 DEBUG nova.objects.instance [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'keypairs' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:10 np0005531888 nova_compute[186788]: 2025-11-22 08:03:10.629 186792 INFO nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Creating config drive at /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config#033[00m
Nov 22 03:03:10 np0005531888 nova_compute[186788]: 2025-11-22 08:03:10.636 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0amgxrl7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:10 np0005531888 podman[228566]: 2025-11-22 08:03:10.685379944 +0000 UTC m=+0.061205545 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:03:10 np0005531888 nova_compute[186788]: 2025-11-22 08:03:10.794 186792 DEBUG oslo_concurrency.processutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0amgxrl7" returned: 0 in 0.157s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:10 np0005531888 kernel: tap43ae6beb-d5: entered promiscuous mode
Nov 22 03:03:10 np0005531888 NetworkManager[55166]: <info>  [1763798590.8557] manager: (tap43ae6beb-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Nov 22 03:03:10 np0005531888 nova_compute[186788]: 2025-11-22 08:03:10.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:10Z|00280|binding|INFO|Claiming lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 for this chassis.
Nov 22 03:03:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:10Z|00281|binding|INFO|43ae6beb-d59a-483d-8ced-1303f84a69d1: Claiming fa:16:3e:57:82:86 10.100.0.11
Nov 22 03:03:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:10Z|00282|binding|INFO|Setting lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 ovn-installed in OVS
Nov 22 03:03:10 np0005531888 nova_compute[186788]: 2025-11-22 08:03:10.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:10 np0005531888 nova_compute[186788]: 2025-11-22 08:03:10.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.886 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:82:86 10.100.0.11'], port_security=['fa:16:3e:57:82:86 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '7', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=43ae6beb-d59a-483d-8ced-1303f84a69d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:10Z|00283|binding|INFO|Setting lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 up in Southbound
Nov 22 03:03:10 np0005531888 systemd-udevd[228606]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.887 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 43ae6beb-d59a-483d-8ced-1303f84a69d1 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 bound to our chassis#033[00m
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.890 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 03:03:10 np0005531888 NetworkManager[55166]: <info>  [1763798590.9049] device (tap43ae6beb-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:03:10 np0005531888 NetworkManager[55166]: <info>  [1763798590.9061] device (tap43ae6beb-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.906 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0def7e81-b07b-47f7-ac0b-a7d1b2899d15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:10 np0005531888 systemd-machined[153106]: New machine qemu-46-instance-00000060.
Nov 22 03:03:10 np0005531888 systemd[1]: Started Virtual Machine qemu-46-instance-00000060.
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.940 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd37b47-42db-40cf-9d73-8d43e35ee832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.947 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[cf3813ea-fca0-4372-b76f-c966085fe5e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:10.983 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d793ef75-4c00-4b1e-982a-72516ecd0bae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.001 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6d678384-4dfa-4eeb-8da3-cc1cbe92c379]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228621, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.021 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[85c09f49-e87a-4948-bbd0-4636c4a4d63c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506813, 'tstamp': 506813}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228623, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506816, 'tstamp': 506816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228623, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.024 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.027 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.029 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.030 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.030 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.030 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:11 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:11.031 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.273 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798591.2721398, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.274 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Started (Lifecycle Event)#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.294 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.300 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798591.2738554, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.301 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.316 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.320 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.337 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.942 186792 DEBUG nova.compute.manager [req-e0737f4e-025b-42aa-8ec6-7408e513abd9 req-54571ad5-87b8-4a23-875c-309498d41018 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.943 186792 DEBUG oslo_concurrency.lockutils [req-e0737f4e-025b-42aa-8ec6-7408e513abd9 req-54571ad5-87b8-4a23-875c-309498d41018 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.943 186792 DEBUG oslo_concurrency.lockutils [req-e0737f4e-025b-42aa-8ec6-7408e513abd9 req-54571ad5-87b8-4a23-875c-309498d41018 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.943 186792 DEBUG oslo_concurrency.lockutils [req-e0737f4e-025b-42aa-8ec6-7408e513abd9 req-54571ad5-87b8-4a23-875c-309498d41018 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.944 186792 DEBUG nova.compute.manager [req-e0737f4e-025b-42aa-8ec6-7408e513abd9 req-54571ad5-87b8-4a23-875c-309498d41018 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Processing event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.944 186792 DEBUG nova.compute.manager [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.948 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798591.947975, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.948 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.950 186792 DEBUG nova.virt.libvirt.driver [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.953 186792 INFO nova.virt.libvirt.driver [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance spawned successfully.#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.987 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:11 np0005531888 nova_compute[186788]: 2025-11-22 08:03:11.991 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:03:12 np0005531888 nova_compute[186788]: 2025-11-22 08:03:12.010 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:03:12 np0005531888 nova_compute[186788]: 2025-11-22 08:03:12.794 186792 DEBUG nova.compute.manager [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:12 np0005531888 nova_compute[186788]: 2025-11-22 08:03:12.892 186792 DEBUG oslo_concurrency.lockutils [None req-a7b10644-865e-4c9a-994b-782bdb3583a7 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 13.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:13 np0005531888 nova_compute[186788]: 2025-11-22 08:03:13.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.010 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.287 186792 DEBUG nova.compute.manager [req-1b7f21fb-4881-43c7-992e-1b93d9285bbb req-a32f7b3e-1044-4f51-95fa-443157197cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.287 186792 DEBUG oslo_concurrency.lockutils [req-1b7f21fb-4881-43c7-992e-1b93d9285bbb req-a32f7b3e-1044-4f51-95fa-443157197cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.287 186792 DEBUG oslo_concurrency.lockutils [req-1b7f21fb-4881-43c7-992e-1b93d9285bbb req-a32f7b3e-1044-4f51-95fa-443157197cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.288 186792 DEBUG oslo_concurrency.lockutils [req-1b7f21fb-4881-43c7-992e-1b93d9285bbb req-a32f7b3e-1044-4f51-95fa-443157197cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.288 186792 DEBUG nova.compute.manager [req-1b7f21fb-4881-43c7-992e-1b93d9285bbb req-a32f7b3e-1044-4f51-95fa-443157197cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] No waiting events found dispatching network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:14 np0005531888 nova_compute[186788]: 2025-11-22 08:03:14.288 186792 WARNING nova.compute.manager [req-1b7f21fb-4881-43c7-992e-1b93d9285bbb req-a32f7b3e-1044-4f51-95fa-443157197cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received unexpected event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:03:14 np0005531888 podman[228632]: 2025-11-22 08:03:14.686473335 +0000 UTC m=+0.055357741 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm)
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.007 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.008 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.066 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.202 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.203 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.208 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.209 186792 INFO nova.compute.claims [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.400 186792 DEBUG nova.compute.provider_tree [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.413 186792 DEBUG nova.scheduler.client.report [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.465 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.466 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.546 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.546 186792 DEBUG nova.network.neutron [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.618 186792 INFO nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.640 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.753 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.755 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.755 186792 INFO nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Creating image(s)#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.756 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.756 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.757 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.769 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.787 186792 DEBUG nova.policy [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.900 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.901 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.901 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.913 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.971 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:15 np0005531888 nova_compute[186788]: 2025-11-22 08:03:15.972 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.064 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk 1073741824" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.065 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.065 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.124 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.125 186792 DEBUG nova.virt.disk.api [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Checking if we can resize image /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.126 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.181 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.182 186792 DEBUG nova.virt.disk.api [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Cannot resize image /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.182 186792 DEBUG nova.objects.instance [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'migration_context' on Instance uuid ad45d92a-70c4-461c-80d8-2c75f978d5e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.193 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.194 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Ensure instance console log exists: /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.194 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.195 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.195 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:16 np0005531888 nova_compute[186788]: 2025-11-22 08:03:16.698 186792 DEBUG nova.network.neutron [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Successfully created port: d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.415 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.415 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.416 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.416 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.416 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.426 186792 INFO nova.compute.manager [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Terminating instance#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.431 186792 DEBUG nova.compute.manager [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:03:17 np0005531888 kernel: tap43ae6beb-d5 (unregistering): left promiscuous mode
Nov 22 03:03:17 np0005531888 NetworkManager[55166]: <info>  [1763798597.4543] device (tap43ae6beb-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.464 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:17Z|00284|binding|INFO|Releasing lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 from this chassis (sb_readonly=0)
Nov 22 03:03:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:17Z|00285|binding|INFO|Setting lport 43ae6beb-d59a-483d-8ced-1303f84a69d1 down in Southbound
Nov 22 03:03:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:17Z|00286|binding|INFO|Removing iface tap43ae6beb-d5 ovn-installed in OVS
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.468 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.474 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:82:86 10.100.0.11'], port_security=['fa:16:3e:57:82:86 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'aaf09935-3011-4bf6-bdf9-28fe60097c1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=43ae6beb-d59a-483d-8ced-1303f84a69d1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.475 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 43ae6beb-d59a-483d-8ced-1303f84a69d1 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 unbound from our chassis#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.477 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.492 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.496 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1169f50e-443b-4235-9b3a-9264ce23f58b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:17 np0005531888 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000060.scope: Deactivated successfully.
Nov 22 03:03:17 np0005531888 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000060.scope: Consumed 5.954s CPU time.
Nov 22 03:03:17 np0005531888 systemd-machined[153106]: Machine qemu-46-instance-00000060 terminated.
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.527 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[68a36c80-cae3-41e7-8a8e-924da861c45e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.532 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5276cdf4-8213-48a1-9299-89142e539d55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.563 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[51a7b0b2-5c1e-4d73-839c-f990d391d4a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.585 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a8becc90-3fb7-4019-bebd-897d91cc4e9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506803, 'reachable_time': 17176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228691, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.605 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8ece99c3-a5a1-40f5-a9da-aa01cebe6986]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506813, 'tstamp': 506813}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228693, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7727db5-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506816, 'tstamp': 506816}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228693, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.607 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.609 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.616 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.617 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.617 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.618 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:17.618 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.720 186792 INFO nova.virt.libvirt.driver [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Instance destroyed successfully.#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.720 186792 DEBUG nova.objects.instance [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid aaf09935-3011-4bf6-bdf9-28fe60097c1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.734 186792 DEBUG nova.virt.libvirt.vif [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-767428975',display_name='tempest-ServerActionsTestOtherB-server-767428975',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-767428975',id=96,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-41j79svb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=aaf09935-3011-4bf6-bdf9-28fe60097c1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.734 186792 DEBUG nova.network.os_vif_util [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "address": "fa:16:3e:57:82:86", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43ae6beb-d5", "ovs_interfaceid": "43ae6beb-d59a-483d-8ced-1303f84a69d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.735 186792 DEBUG nova.network.os_vif_util [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.735 186792 DEBUG os_vif [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.737 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.737 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43ae6beb-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.739 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.741 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.743 186792 INFO os_vif [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:82:86,bridge_name='br-int',has_traffic_filtering=True,id=43ae6beb-d59a-483d-8ced-1303f84a69d1,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43ae6beb-d5')#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.744 186792 INFO nova.virt.libvirt.driver [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Deleting instance files /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c_del#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.749 186792 INFO nova.virt.libvirt.driver [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Deletion of /var/lib/nova/instances/aaf09935-3011-4bf6-bdf9-28fe60097c1c_del complete#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.755 186792 DEBUG nova.compute.manager [req-865c8ff4-65e4-40bc-a1ed-5bd254569a6f req-78ff4668-10c7-4cbb-ab12-bedacd9246e0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-unplugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.756 186792 DEBUG oslo_concurrency.lockutils [req-865c8ff4-65e4-40bc-a1ed-5bd254569a6f req-78ff4668-10c7-4cbb-ab12-bedacd9246e0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.756 186792 DEBUG oslo_concurrency.lockutils [req-865c8ff4-65e4-40bc-a1ed-5bd254569a6f req-78ff4668-10c7-4cbb-ab12-bedacd9246e0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.756 186792 DEBUG oslo_concurrency.lockutils [req-865c8ff4-65e4-40bc-a1ed-5bd254569a6f req-78ff4668-10c7-4cbb-ab12-bedacd9246e0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.756 186792 DEBUG nova.compute.manager [req-865c8ff4-65e4-40bc-a1ed-5bd254569a6f req-78ff4668-10c7-4cbb-ab12-bedacd9246e0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] No waiting events found dispatching network-vif-unplugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.756 186792 DEBUG nova.compute.manager [req-865c8ff4-65e4-40bc-a1ed-5bd254569a6f req-78ff4668-10c7-4cbb-ab12-bedacd9246e0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-unplugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.804 186792 INFO nova.compute.manager [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.804 186792 DEBUG oslo.service.loopingcall [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.804 186792 DEBUG nova.compute.manager [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:03:17 np0005531888 nova_compute[186788]: 2025-11-22 08:03:17.805 186792 DEBUG nova.network.neutron [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:03:18 np0005531888 nova_compute[186788]: 2025-11-22 08:03:18.021 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:18 np0005531888 nova_compute[186788]: 2025-11-22 08:03:18.657 186792 DEBUG nova.network.neutron [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Successfully updated port: d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:03:18 np0005531888 nova_compute[186788]: 2025-11-22 08:03:18.673 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:18 np0005531888 nova_compute[186788]: 2025-11-22 08:03:18.673 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:18 np0005531888 nova_compute[186788]: 2025-11-22 08:03:18.673 186792 DEBUG nova.network.neutron [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:18 np0005531888 nova_compute[186788]: 2025-11-22 08:03:18.844 186792 DEBUG nova.network.neutron [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.273 186792 DEBUG nova.network.neutron [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.305 186792 INFO nova.compute.manager [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Took 1.50 seconds to deallocate network for instance.#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.359 186792 DEBUG nova.compute.manager [req-62da6e11-626f-43ec-a727-4f9682441d67 req-e6c5153a-96c4-4496-aaab-c3cb6043aa3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-deleted-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.397 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.397 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.513 186792 DEBUG nova.compute.provider_tree [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.529 186792 DEBUG nova.scheduler.client.report [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.554 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.609 186792 INFO nova.scheduler.client.report [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Deleted allocations for instance aaf09935-3011-4bf6-bdf9-28fe60097c1c#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.698 186792 DEBUG oslo_concurrency.lockutils [None req-d5cbe833-6939-4039-b074-5895de8eaf32 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.959 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.960 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.960 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.960 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.961 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.969 186792 INFO nova.compute.manager [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Terminating instance#033[00m
Nov 22 03:03:19 np0005531888 nova_compute[186788]: 2025-11-22 08:03:19.982 186792 DEBUG nova.compute.manager [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:03:20 np0005531888 kernel: tap487183e6-b0 (unregistering): left promiscuous mode
Nov 22 03:03:20 np0005531888 NetworkManager[55166]: <info>  [1763798600.0076] device (tap487183e6-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.012 186792 DEBUG nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.012 186792 DEBUG nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing instance network info cache due to event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.012 186792 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.014 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:20Z|00287|binding|INFO|Releasing lport 487183e6-b09b-4561-97a9-8f8e44492221 from this chassis (sb_readonly=0)
Nov 22 03:03:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:20Z|00288|binding|INFO|Setting lport 487183e6-b09b-4561-97a9-8f8e44492221 down in Southbound
Nov 22 03:03:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:20Z|00289|binding|INFO|Removing iface tap487183e6-b0 ovn-installed in OVS
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:20.030 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:0b:78 10.100.0.10'], port_security=['fa:16:3e:2d:0b:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '80cb8b15-443c-424b-894c-1ed6674f77d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80908aff-0365-41dd-a88b-8ec1981e86fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=487183e6-b09b-4561-97a9-8f8e44492221) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.029 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:20.031 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 487183e6-b09b-4561-97a9-8f8e44492221 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 unbound from our chassis#033[00m
Nov 22 03:03:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:20.033 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7727db5-43a6-48f6-abbf-aa184d8ad087, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:03:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:20.034 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e59678b6-b0b8-40dd-8cb6-81c8d6652fd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:20.035 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 namespace which is not needed anymore#033[00m
Nov 22 03:03:20 np0005531888 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Nov 22 03:03:20 np0005531888 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000005c.scope: Consumed 22.102s CPU time.
Nov 22 03:03:20 np0005531888 systemd-machined[153106]: Machine qemu-40-instance-0000005c terminated.
Nov 22 03:03:20 np0005531888 podman[228716]: 2025-11-22 08:03:20.112922034 +0000 UTC m=+0.073642943 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:03:20 np0005531888 podman[228718]: 2025-11-22 08:03:20.137803536 +0000 UTC m=+0.098721749 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.241 186792 INFO nova.virt.libvirt.driver [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Instance destroyed successfully.#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.242 186792 DEBUG nova.objects.instance [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid 80cb8b15-443c-424b-894c-1ed6674f77d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:20 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [NOTICE]   (225626) : haproxy version is 2.8.14-c23fe91
Nov 22 03:03:20 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [NOTICE]   (225626) : path to executable is /usr/sbin/haproxy
Nov 22 03:03:20 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [WARNING]  (225626) : Exiting Master process...
Nov 22 03:03:20 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [WARNING]  (225626) : Exiting Master process...
Nov 22 03:03:20 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [ALERT]    (225626) : Current worker (225628) exited with code 143 (Terminated)
Nov 22 03:03:20 np0005531888 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[225622]: [WARNING]  (225626) : All workers exited. Exiting... (0)
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.264 186792 DEBUG nova.virt.libvirt.vif [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1246188074',display_name='tempest-ServerActionsTestOtherB-server-1246188074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1246188074',id=92,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-bcb74bw3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:16Z,user_data=None,user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=80cb8b15-443c-424b-894c-1ed6674f77d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.264 186792 DEBUG nova.network.os_vif_util [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "487183e6-b09b-4561-97a9-8f8e44492221", "address": "fa:16:3e:2d:0b:78", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap487183e6-b0", "ovs_interfaceid": "487183e6-b09b-4561-97a9-8f8e44492221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.265 186792 DEBUG nova.network.os_vif_util [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.266 186792 DEBUG os_vif [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:03:20 np0005531888 systemd[1]: libpod-f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac.scope: Deactivated successfully.
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.267 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.267 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap487183e6-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.268 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.270 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.273 186792 INFO os_vif [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:0b:78,bridge_name='br-int',has_traffic_filtering=True,id=487183e6-b09b-4561-97a9-8f8e44492221,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap487183e6-b0')#033[00m
Nov 22 03:03:20 np0005531888 podman[228770]: 2025-11-22 08:03:20.273641726 +0000 UTC m=+0.149268922 container died f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.273 186792 INFO nova.virt.libvirt.driver [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Deleting instance files /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5_del#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.274 186792 INFO nova.virt.libvirt.driver [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Deletion of /var/lib/nova/instances/80cb8b15-443c-424b-894c-1ed6674f77d5_del complete#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.409 186792 INFO nova.compute.manager [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.410 186792 DEBUG oslo.service.loopingcall [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.410 186792 DEBUG nova.compute.manager [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.411 186792 DEBUG nova.network.neutron [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.539 186792 DEBUG nova.compute.manager [req-bfa5a5f8-91c5-45ed-95af-54319a4f0ce4 req-41b21481-730c-492f-aff8-24b1833b6c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-vif-unplugged-487183e6-b09b-4561-97a9-8f8e44492221 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.539 186792 DEBUG oslo_concurrency.lockutils [req-bfa5a5f8-91c5-45ed-95af-54319a4f0ce4 req-41b21481-730c-492f-aff8-24b1833b6c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.541 186792 DEBUG oslo_concurrency.lockutils [req-bfa5a5f8-91c5-45ed-95af-54319a4f0ce4 req-41b21481-730c-492f-aff8-24b1833b6c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.541 186792 DEBUG oslo_concurrency.lockutils [req-bfa5a5f8-91c5-45ed-95af-54319a4f0ce4 req-41b21481-730c-492f-aff8-24b1833b6c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.541 186792 DEBUG nova.compute.manager [req-bfa5a5f8-91c5-45ed-95af-54319a4f0ce4 req-41b21481-730c-492f-aff8-24b1833b6c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] No waiting events found dispatching network-vif-unplugged-487183e6-b09b-4561-97a9-8f8e44492221 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.542 186792 DEBUG nova.compute.manager [req-bfa5a5f8-91c5-45ed-95af-54319a4f0ce4 req-41b21481-730c-492f-aff8-24b1833b6c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-vif-unplugged-487183e6-b09b-4561-97a9-8f8e44492221 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.548 186792 DEBUG nova.network.neutron [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.564 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.564 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Instance network_info: |[{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.565 186792 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.565 186792 DEBUG nova.network.neutron [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.568 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Start _get_guest_xml network_info=[{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.574 186792 WARNING nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.580 186792 DEBUG nova.virt.libvirt.host [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.581 186792 DEBUG nova.virt.libvirt.host [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.584 186792 DEBUG nova.virt.libvirt.host [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.585 186792 DEBUG nova.virt.libvirt.host [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.586 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.586 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.586 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.587 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.587 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.587 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.587 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.588 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.588 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.588 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.588 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.589 186792 DEBUG nova.virt.hardware [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.593 186792 DEBUG nova.virt.libvirt.vif [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.593 186792 DEBUG nova.network.os_vif_util [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.594 186792 DEBUG nova.network.os_vif_util [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.595 186792 DEBUG nova.objects.instance [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_devices' on Instance uuid ad45d92a-70c4-461c-80d8-2c75f978d5e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.608 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <uuid>ad45d92a-70c4-461c-80d8-2c75f978d5e6</uuid>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <name>instance-00000066</name>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:name>tempest-tempest.common.compute-instance-1110835689</nova:name>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:03:20</nova:creationTime>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        <nova:port uuid="d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <entry name="serial">ad45d92a-70c4-461c-80d8-2c75f978d5e6</entry>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <entry name="uuid">ad45d92a-70c4-461c-80d8-2c75f978d5e6</entry>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.config"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:65:62:3d"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <target dev="tapd7a9f14e-7a"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/console.log" append="off"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:03:20 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:03:20 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:03:20 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:03:20 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.609 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Preparing to wait for external event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.609 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.609 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.609 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.610 186792 DEBUG nova.virt.libvirt.vif [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.610 186792 DEBUG nova.network.os_vif_util [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.610 186792 DEBUG nova.network.os_vif_util [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.611 186792 DEBUG os_vif [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.611 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.611 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.612 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.613 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.614 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7a9f14e-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.614 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7a9f14e-7a, col_values=(('external_ids', {'iface-id': 'd7a9f14e-7a03-4a17-aff2-c6c5bf3e9938', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:62:3d', 'vm-uuid': 'ad45d92a-70c4-461c-80d8-2c75f978d5e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.615 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:20 np0005531888 NetworkManager[55166]: <info>  [1763798600.6161] manager: (tapd7a9f14e-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.618 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.620 186792 INFO os_vif [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a')#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.947 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.947 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.947 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:65:62:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:03:20 np0005531888 nova_compute[186788]: 2025-11-22 08:03:20.948 186792 INFO nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Using config drive#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.077 186792 DEBUG nova.network.neutron [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.123 186792 INFO nova.compute.manager [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Took 0.71 seconds to deallocate network for instance.#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.183 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.184 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.258 186792 DEBUG nova.compute.provider_tree [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.271 186792 DEBUG nova.scheduler.client.report [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.289 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.294 186792 INFO nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Creating config drive at /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.config#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.301 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo2zhqtur execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.340 186792 INFO nova.scheduler.client.report [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Deleted allocations for instance 80cb8b15-443c-424b-894c-1ed6674f77d5#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.412 186792 DEBUG oslo_concurrency.lockutils [None req-ee833654-530f-4494-92f1-d60761b9b271 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.428 186792 DEBUG oslo_concurrency.processutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo2zhqtur" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:21 np0005531888 kernel: tapd7a9f14e-7a: entered promiscuous mode
Nov 22 03:03:21 np0005531888 NetworkManager[55166]: <info>  [1763798601.4857] manager: (tapd7a9f14e-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.484 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:21Z|00290|binding|INFO|Claiming lport d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 for this chassis.
Nov 22 03:03:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:21Z|00291|binding|INFO|d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938: Claiming fa:16:3e:65:62:3d 10.100.0.11
Nov 22 03:03:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:21.495 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:62:3d 10.100.0.11'], port_security=['fa:16:3e:65:62:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ad45d92a-70c4-461c-80d8-2c75f978d5e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6da332c7-e52a-4f92-8c24-c2ee0c6e77d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:21Z|00292|binding|INFO|Setting lport d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 ovn-installed in OVS
Nov 22 03:03:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:21Z|00293|binding|INFO|Setting lport d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 up in Southbound
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.501 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:21 np0005531888 systemd-udevd[228841]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:03:21 np0005531888 NetworkManager[55166]: <info>  [1763798601.5204] device (tapd7a9f14e-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:03:21 np0005531888 NetworkManager[55166]: <info>  [1763798601.5216] device (tapd7a9f14e-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:03:21 np0005531888 systemd-machined[153106]: New machine qemu-47-instance-00000066.
Nov 22 03:03:21 np0005531888 systemd[1]: Started Virtual Machine qemu-47-instance-00000066.
Nov 22 03:03:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay-26a18f2fa3531b624cc4901bea16207a881a8862117de9a8a0d94429734c42a5-merged.mount: Deactivated successfully.
Nov 22 03:03:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac-userdata-shm.mount: Deactivated successfully.
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.678 186792 DEBUG nova.network.neutron [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updated VIF entry in instance network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.678 186792 DEBUG nova.network.neutron [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.693 186792 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.693 186792 DEBUG nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.694 186792 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.694 186792 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.694 186792 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "aaf09935-3011-4bf6-bdf9-28fe60097c1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.694 186792 DEBUG nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] No waiting events found dispatching network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.694 186792 WARNING nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Received unexpected event network-vif-plugged-43ae6beb-d59a-483d-8ced-1303f84a69d1 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.744 186792 DEBUG nova.compute.manager [req-d91f83d3-1bff-43ac-a34a-331614324ca6 req-3a0ce8c9-7c97-47d5-bb31-b7b753fb312f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.745 186792 DEBUG oslo_concurrency.lockutils [req-d91f83d3-1bff-43ac-a34a-331614324ca6 req-3a0ce8c9-7c97-47d5-bb31-b7b753fb312f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.745 186792 DEBUG oslo_concurrency.lockutils [req-d91f83d3-1bff-43ac-a34a-331614324ca6 req-3a0ce8c9-7c97-47d5-bb31-b7b753fb312f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.745 186792 DEBUG oslo_concurrency.lockutils [req-d91f83d3-1bff-43ac-a34a-331614324ca6 req-3a0ce8c9-7c97-47d5-bb31-b7b753fb312f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:21 np0005531888 nova_compute[186788]: 2025-11-22 08:03:21.746 186792 DEBUG nova.compute.manager [req-d91f83d3-1bff-43ac-a34a-331614324ca6 req-3a0ce8c9-7c97-47d5-bb31-b7b753fb312f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Processing event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.051 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.052 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798602.0503652, ad45d92a-70c4-461c-80d8-2c75f978d5e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.052 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] VM Started (Lifecycle Event)#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.057 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.060 186792 INFO nova.virt.libvirt.driver [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Instance spawned successfully.#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.060 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.074 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.081 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.086 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.086 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.087 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.087 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.087 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.088 186792 DEBUG nova.virt.libvirt.driver [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.121 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.122 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798602.0542378, ad45d92a-70c4-461c-80d8-2c75f978d5e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.122 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.159 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.163 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798602.056237, ad45d92a-70c4-461c-80d8-2c75f978d5e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.163 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.178 186792 INFO nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Took 6.42 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.178 186792 DEBUG nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.182 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.188 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.218 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.254 186792 INFO nova.compute.manager [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Took 7.09 seconds to build instance.#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.270 186792 DEBUG oslo_concurrency.lockutils [None req-1ee709f3-6392-4393-a6d7-33bd450f6688 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.624 186792 DEBUG nova.compute.manager [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.624 186792 DEBUG oslo_concurrency.lockutils [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.625 186792 DEBUG oslo_concurrency.lockutils [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.625 186792 DEBUG oslo_concurrency.lockutils [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "80cb8b15-443c-424b-894c-1ed6674f77d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.625 186792 DEBUG nova.compute.manager [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] No waiting events found dispatching network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.625 186792 WARNING nova.compute.manager [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received unexpected event network-vif-plugged-487183e6-b09b-4561-97a9-8f8e44492221 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:03:22 np0005531888 nova_compute[186788]: 2025-11-22 08:03:22.625 186792 DEBUG nova.compute.manager [req-4622f731-a190-402e-b6ef-77206cdf79a8 req-14f31ce4-4a17-435a-adce-9a6dbc6fad70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Received event network-vif-deleted-487183e6-b09b-4561-97a9-8f8e44492221 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:22 np0005531888 podman[228770]: 2025-11-22 08:03:22.716292576 +0000 UTC m=+2.591919772 container cleanup f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:03:22 np0005531888 systemd[1]: libpod-conmon-f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac.scope: Deactivated successfully.
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.024 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 podman[228859]: 2025-11-22 08:03:23.380457526 +0000 UTC m=+0.636929361 container remove f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.388 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[60f6cb23-e000-464b-ac75-9ae2e045cea1]: (4, ('Sat Nov 22 08:03:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 (f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac)\nf53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac\nSat Nov 22 08:03:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 (f53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac)\nf53c71bdaf49263980b8c2663cfc8cdf7e6e4e82a95b294d22c0e93f3d162dac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.391 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fdac522a-dfee-4cb5-b408-129dd4547437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.392 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.393 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 kernel: tapf7727db5-40: left promiscuous mode
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.415 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c2cae651-afe4-4997-91a9-d6892341dce0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.438 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0e72cb9b-593b-4d1a-9dd6-12bbd3dac26d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.439 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3579f53-9820-4d52-84d8-ba5f47bb8269]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.454 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6c833765-a0f3-4167-8abd-f51488421d89]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506796, 'reachable_time': 21849, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228875, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.457 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.457 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[c0530dda-7264-4002-8139-734990586334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.458 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:03:23 np0005531888 systemd[1]: run-netns-ovnmeta\x2df7727db5\x2d43a6\x2d48f6\x2dabbf\x2daa184d8ad087.mount: Deactivated successfully.
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.459 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.470 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aad8112e-e086-4afc-91c9-b915ce74411b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.471 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a4a282c-d1 in ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.473 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a4a282c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.473 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[74636667-db5e-4cf0-b5c6-2840b4d66dbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.474 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[26664d80-d84c-4c5a-80ae-612eb6329302]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.485 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[945871d0-1685-460e-bd63-66d3028a75a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.518 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0c789082-1e2a-45bd-a906-8a23d11b6dc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.548 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c822d5e6-5d74-4fdc-a528-c72e56e3700e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 systemd-udevd[228844]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.554 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[22b2ba38-e4cf-4616-b6cf-772affef996b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 NetworkManager[55166]: <info>  [1763798603.5570] manager: (tap6a4a282c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/152)
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.593 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ddf182-d0be-4576-9d32-3f5e85029dd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.597 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e7d0d2-70f6-4771-bec2-8b14db10b3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 NetworkManager[55166]: <info>  [1763798603.6163] device (tap6a4a282c-d0): carrier: link connected
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.620 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfc5788-280b-4747-85d8-fc40a73f2bb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.638 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[38d8e81f-3076-4e55-8424-5c740b40a549]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534426, 'reachable_time': 38194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228900, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.651 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[24a7106d-ee45-4620-bed9-bd211c3e83e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:7a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534426, 'tstamp': 534426}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228901, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.665 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[85488c1c-4ca3-4491-93be-798d6513e207]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534426, 'reachable_time': 38194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228902, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.687 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ce18b211-4f5b-4ad2-a06e-75c94c123963]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.735 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8a7abe-e684-4a6e-bb97-649fe7ab55a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.737 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.737 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.738 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.739 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 kernel: tap6a4a282c-d0: entered promiscuous mode
Nov 22 03:03:23 np0005531888 NetworkManager[55166]: <info>  [1763798603.7404] manager: (tap6a4a282c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.745 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.746 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:23Z|00294|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.749 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.751 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[42a094e2-a09d-4a27-a0bb-6986dd99d614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.752 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:03:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:23.753 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'env', 'PROCESS_TAG=haproxy-6a4a282c-db22-41de-b34b-2960aa032ca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a4a282c-db22-41de-b34b-2960aa032ca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.817 186792 DEBUG nova.compute.manager [req-a7b94b4d-cb89-42dd-8a0d-d004e9dabdcd req-e60721b5-c54f-40bc-be3c-1e135992ba0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.818 186792 DEBUG oslo_concurrency.lockutils [req-a7b94b4d-cb89-42dd-8a0d-d004e9dabdcd req-e60721b5-c54f-40bc-be3c-1e135992ba0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.819 186792 DEBUG oslo_concurrency.lockutils [req-a7b94b4d-cb89-42dd-8a0d-d004e9dabdcd req-e60721b5-c54f-40bc-be3c-1e135992ba0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.819 186792 DEBUG oslo_concurrency.lockutils [req-a7b94b4d-cb89-42dd-8a0d-d004e9dabdcd req-e60721b5-c54f-40bc-be3c-1e135992ba0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.819 186792 DEBUG nova.compute.manager [req-a7b94b4d-cb89-42dd-8a0d-d004e9dabdcd req-e60721b5-c54f-40bc-be3c-1e135992ba0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:23 np0005531888 nova_compute[186788]: 2025-11-22 08:03:23.820 186792 WARNING nova.compute.manager [req-a7b94b4d-cb89-42dd-8a0d-d004e9dabdcd req-e60721b5-c54f-40bc-be3c-1e135992ba0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received unexpected event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:03:24 np0005531888 podman[228934]: 2025-11-22 08:03:24.186743372 +0000 UTC m=+0.023046727 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:03:24 np0005531888 podman[228934]: 2025-11-22 08:03:24.465787184 +0000 UTC m=+0.302090529 container create cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:03:24 np0005531888 systemd[1]: Started libpod-conmon-cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e.scope.
Nov 22 03:03:24 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:03:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f408171d412da31c590fc5ffbfbb3785214ce2f261c2995fba8bdb0ede87ce41/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:03:24 np0005531888 podman[228934]: 2025-11-22 08:03:24.661848875 +0000 UTC m=+0.498152250 container init cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:03:24 np0005531888 podman[228934]: 2025-11-22 08:03:24.670866206 +0000 UTC m=+0.507169541 container start cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:03:24 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [NOTICE]   (228951) : New worker (228954) forked
Nov 22 03:03:24 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [NOTICE]   (228951) : Loading success.
Nov 22 03:03:25 np0005531888 nova_compute[186788]: 2025-11-22 08:03:25.616 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:28 np0005531888 nova_compute[186788]: 2025-11-22 08:03:28.025 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:28 np0005531888 nova_compute[186788]: 2025-11-22 08:03:28.630 186792 DEBUG nova.compute.manager [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:28 np0005531888 nova_compute[186788]: 2025-11-22 08:03:28.631 186792 DEBUG nova.compute.manager [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing instance network info cache due to event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:28 np0005531888 nova_compute[186788]: 2025-11-22 08:03:28.631 186792 DEBUG oslo_concurrency.lockutils [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:28 np0005531888 nova_compute[186788]: 2025-11-22 08:03:28.632 186792 DEBUG oslo_concurrency.lockutils [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:28 np0005531888 nova_compute[186788]: 2025-11-22 08:03:28.632 186792 DEBUG nova.network.neutron [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:29 np0005531888 podman[228963]: 2025-11-22 08:03:29.678348662 +0000 UTC m=+0.052511592 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:03:29 np0005531888 podman[228964]: 2025-11-22 08:03:29.680499185 +0000 UTC m=+0.051486517 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:03:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:30Z|00295|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:03:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:30Z|00296|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.555 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.619 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.759 186792 DEBUG nova.network.neutron [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updated VIF entry in instance network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.761 186792 DEBUG nova.network.neutron [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.791 186792 DEBUG nova.compute.manager [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.792 186792 DEBUG nova.compute.manager [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing instance network info cache due to event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.792 186792 DEBUG oslo_concurrency.lockutils [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.839 186792 DEBUG oslo_concurrency.lockutils [req-8ca1402a-df8f-4559-aa6c-49fc60a36e2e req-85d7ccdb-e50e-46e6-b477-e170aac95272 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.839 186792 DEBUG oslo_concurrency.lockutils [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:30 np0005531888 nova_compute[186788]: 2025-11-22 08:03:30.840 186792 DEBUG nova.network.neutron [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:32 np0005531888 nova_compute[186788]: 2025-11-22 08:03:32.718 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798597.7177637, aaf09935-3011-4bf6-bdf9-28fe60097c1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:32 np0005531888 nova_compute[186788]: 2025-11-22 08:03:32.721 186792 INFO nova.compute.manager [-] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:03:32 np0005531888 nova_compute[186788]: 2025-11-22 08:03:32.760 186792 DEBUG nova.compute.manager [None req-a53b213c-6775-4888-ab62-aa5f4a96e047 - - - - - -] [instance: aaf09935-3011-4bf6-bdf9-28fe60097c1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:32 np0005531888 nova_compute[186788]: 2025-11-22 08:03:32.763 186792 DEBUG nova.network.neutron [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updated VIF entry in instance network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:32 np0005531888 nova_compute[186788]: 2025-11-22 08:03:32.764 186792 DEBUG nova.network.neutron [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:32 np0005531888 nova_compute[186788]: 2025-11-22 08:03:32.796 186792 DEBUG oslo_concurrency.lockutils [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:33 np0005531888 nova_compute[186788]: 2025-11-22 08:03:33.028 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.240 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798600.2393901, 80cb8b15-443c-424b-894c-1ed6674f77d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.242 186792 INFO nova.compute.manager [-] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.268 186792 DEBUG nova.compute.manager [None req-f28eb759-0871-483f-82ee-2fa7371f42ce - - - - - -] [instance: 80cb8b15-443c-424b-894c-1ed6674f77d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.624 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.967 186792 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.968 186792 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:35 np0005531888 nova_compute[186788]: 2025-11-22 08:03:35.968 186792 DEBUG nova.network.neutron [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:36.816 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:36.817 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:36.820 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:37 np0005531888 nova_compute[186788]: 2025-11-22 08:03:37.990 186792 DEBUG nova.network.neutron [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.009 186792 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.190 186792 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.191 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Creating file /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/72e2e95a922a43bfb4acbe77123d5e6c.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.191 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/72e2e95a922a43bfb4acbe77123d5e6c.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.667 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/72e2e95a922a43bfb4acbe77123d5e6c.tmp" returned: 1 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.667 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/72e2e95a922a43bfb4acbe77123d5e6c.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.668 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Creating directory /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.668 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:38 np0005531888 podman[229022]: 2025-11-22 08:03:38.694970958 +0000 UTC m=+0.057541006 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.888 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:38 np0005531888 nova_compute[186788]: 2025-11-22 08:03:38.894 186792 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:03:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:39Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:62:3d 10.100.0.11
Nov 22 03:03:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:39Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:62:3d 10.100.0.11
Nov 22 03:03:40 np0005531888 nova_compute[186788]: 2025-11-22 08:03:40.331 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:40 np0005531888 nova_compute[186788]: 2025-11-22 08:03:40.626 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 kernel: tapa2f45e58-23 (unregistering): left promiscuous mode
Nov 22 03:03:41 np0005531888 NetworkManager[55166]: <info>  [1763798621.1542] device (tapa2f45e58-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:03:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:41Z|00297|binding|INFO|Releasing lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe from this chassis (sb_readonly=0)
Nov 22 03:03:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:41Z|00298|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe down in Southbound
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.168 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:41Z|00299|binding|INFO|Removing iface tapa2f45e58-23 ovn-installed in OVS
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.170 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:41.179 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:41.181 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.182 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:41.183 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:03:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:41.185 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a6157507-b0fa-4741-8c61-24f499fd1ef7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:41.185 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore#033[00m
Nov 22 03:03:41 np0005531888 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Nov 22 03:03:41 np0005531888 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005d.scope: Consumed 19.899s CPU time.
Nov 22 03:03:41 np0005531888 systemd-machined[153106]: Machine qemu-45-instance-0000005d terminated.
Nov 22 03:03:41 np0005531888 podman[229044]: 2025-11-22 08:03:41.231109578 +0000 UTC m=+0.049336845 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:03:41 np0005531888 NetworkManager[55166]: <info>  [1763798621.3864] manager: (tapa2f45e58-23): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.388 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.395 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [NOTICE]   (228121) : haproxy version is 2.8.14-c23fe91
Nov 22 03:03:41 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [NOTICE]   (228121) : path to executable is /usr/sbin/haproxy
Nov 22 03:03:41 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [WARNING]  (228121) : Exiting Master process...
Nov 22 03:03:41 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [ALERT]    (228121) : Current worker (228123) exited with code 143 (Terminated)
Nov 22 03:03:41 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[228117]: [WARNING]  (228121) : All workers exited. Exiting... (0)
Nov 22 03:03:41 np0005531888 systemd[1]: libpod-d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732.scope: Deactivated successfully.
Nov 22 03:03:41 np0005531888 podman[229090]: 2025-11-22 08:03:41.418248989 +0000 UTC m=+0.143078899 container died d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:03:41 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732-userdata-shm.mount: Deactivated successfully.
Nov 22 03:03:41 np0005531888 systemd[1]: var-lib-containers-storage-overlay-ab8d5943c2fffc70017a6d0ecd1b4ee00bb1d30b210f7e70649fc181d3a87013-merged.mount: Deactivated successfully.
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.913 186792 INFO nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 03:03:41 np0005531888 podman[229090]: 2025-11-22 08:03:41.916738816 +0000 UTC m=+0.641568726 container cleanup d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.922 186792 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance destroyed successfully.#033[00m
Nov 22 03:03:41 np0005531888 systemd[1]: libpod-conmon-d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732.scope: Deactivated successfully.
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.924 186792 DEBUG nova.virt.libvirt.vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.925 186792 DEBUG nova.network.os_vif_util [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.928 186792 DEBUG nova.network.os_vif_util [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.930 186792 DEBUG os_vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.934 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.935 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2f45e58-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.944 186792 INFO os_vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:03:41 np0005531888 nova_compute[186788]: 2025-11-22 08:03:41.950 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.017 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.019 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.084 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.086 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Copying file /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk to 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.086 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:42 np0005531888 podman[229135]: 2025-11-22 08:03:42.25689036 +0000 UTC m=+0.315409376 container remove d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.262 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[13ab0acc-b518-4cdf-b12b-9155ecec158d]: (4, ('Sat Nov 22 08:03:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732)\nd074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732\nSat Nov 22 08:03:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (d074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732)\nd074a3938ec9ff969e127f962479616a27c720ffc8a6aa5264eb0f8a260b3732\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.264 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2912797d-2895-4c94-9114-9e46ff12f855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.265 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:42 np0005531888 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.267 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.285 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:42 np0005531888 nova_compute[186788]: 2025-11-22 08:03:42.286 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.289 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[24c87b9a-614f-43d1-928e-b0bbacffd8e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.321 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[da5e392c-f294-4fe5-a7a5-4d368d6e4ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.322 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ab9cac-6c07-4cfe-a2ed-e5bf1267d06e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.338 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[73fdd0e1-870c-45b2-b853-b1794f442ec8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526145, 'reachable_time': 32311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229158, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.340 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:03:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:03:42.340 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f8dfe217-d69a-41e1-9bc1-2c31fafcf597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:42 np0005531888 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.033 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.131 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.262 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "scp -r /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk" returned: 0 in 1.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.263 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Copying file /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.264 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk.config 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.488 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "scp -C -r /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk.config 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.489 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Copying file /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.489 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk.info 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.683 186792 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "scp -C -r /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_resize/disk.info 192.168.122.100:/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.info" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:43 np0005531888 nova_compute[186788]: 2025-11-22 08:03:43.870 186792 DEBUG neutronclient.v2_0.client [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.005 186792 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.005 186792 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.006 186792 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.774 186792 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.775 186792 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing instance network info cache due to event network-changed-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.775 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.775 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:44 np0005531888 nova_compute[186788]: 2025-11-22 08:03:44.776 186792 DEBUG nova.network.neutron [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:45 np0005531888 podman[229164]: 2025-11-22 08:03:45.69426703 +0000 UTC m=+0.057949036 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.542 186792 DEBUG oslo_concurrency.lockutils [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-ad45d92a-70c4-461c-80d8-2c75f978d5e6-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.542 186792 DEBUG oslo_concurrency.lockutils [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-ad45d92a-70c4-461c-80d8-2c75f978d5e6-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.543 186792 DEBUG nova.objects.instance [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid ad45d92a-70c4-461c-80d8-2c75f978d5e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.732 186792 DEBUG nova.network.neutron [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updated VIF entry in instance network info cache for port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.732 186792 DEBUG nova.network.neutron [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.767 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.768 186792 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.768 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.768 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.768 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.769 186792 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.769 186792 WARNING nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.769 186792 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.770 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.770 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.770 186792 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.771 186792 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.771 186792 WARNING nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.929 186792 DEBUG nova.compute.manager [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.930 186792 DEBUG nova.compute.manager [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing instance network info cache due to event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.930 186792 DEBUG oslo_concurrency.lockutils [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.930 186792 DEBUG oslo_concurrency.lockutils [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.931 186792 DEBUG nova.network.neutron [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:46 np0005531888 nova_compute[186788]: 2025-11-22 08:03:46.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:47 np0005531888 nova_compute[186788]: 2025-11-22 08:03:47.179 186792 DEBUG nova.objects.instance [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_requests' on Instance uuid ad45d92a-70c4-461c-80d8-2c75f978d5e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:03:47Z|00300|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:03:47 np0005531888 nova_compute[186788]: 2025-11-22 08:03:47.191 186792 DEBUG nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:03:47 np0005531888 nova_compute[186788]: 2025-11-22 08:03:47.236 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:47 np0005531888 nova_compute[186788]: 2025-11-22 08:03:47.736 186792 DEBUG nova.policy [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:03:48 np0005531888 nova_compute[186788]: 2025-11-22 08:03:48.035 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:48 np0005531888 nova_compute[186788]: 2025-11-22 08:03:48.748 186792 DEBUG nova.network.neutron [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updated VIF entry in instance network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:48 np0005531888 nova_compute[186788]: 2025-11-22 08:03:48.748 186792 DEBUG nova.network.neutron [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:48 np0005531888 nova_compute[186788]: 2025-11-22 08:03:48.859 186792 DEBUG oslo_concurrency.lockutils [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.525 186792 DEBUG nova.compute.manager [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.526 186792 DEBUG oslo_concurrency.lockutils [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.526 186792 DEBUG oslo_concurrency.lockutils [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.526 186792 DEBUG oslo_concurrency.lockutils [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.526 186792 DEBUG nova.compute.manager [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.527 186792 WARNING nova.compute.manager [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:03:49 np0005531888 nova_compute[186788]: 2025-11-22 08:03:49.891 186792 DEBUG nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Successfully updated port: 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.283 186792 DEBUG nova.compute.manager [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-changed-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.283 186792 DEBUG nova.compute.manager [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing instance network info cache due to event network-changed-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.284 186792 DEBUG oslo_concurrency.lockutils [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.285 186792 DEBUG oslo_concurrency.lockutils [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.285 186792 DEBUG nova.network.neutron [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Refreshing network info cache for port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.331 186792 DEBUG oslo_concurrency.lockutils [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.394 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.395 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.396 186792 DEBUG nova.compute.manager [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Going to confirm migration 15 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 03:03:50 np0005531888 nova_compute[186788]: 2025-11-22 08:03:50.432 186792 DEBUG nova.objects.instance [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'info_cache' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:50 np0005531888 podman[229185]: 2025-11-22 08:03:50.689898064 +0000 UTC m=+0.064120598 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 22 03:03:50 np0005531888 podman[229186]: 2025-11-22 08:03:50.724758071 +0000 UTC m=+0.091525311 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.241 186792 DEBUG neutronclient.v2_0.client [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.242 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.242 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.242 186792 DEBUG nova.network.neutron [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.640 186792 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.641 186792 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.641 186792 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.641 186792 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.642 186792 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.642 186792 WARNING nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:03:51 np0005531888 nova_compute[186788]: 2025-11-22 08:03:51.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:03:52 np0005531888 nova_compute[186788]: 2025-11-22 08:03:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:52 np0005531888 nova_compute[186788]: 2025-11-22 08:03:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:52 np0005531888 nova_compute[186788]: 2025-11-22 08:03:52.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:52 np0005531888 nova_compute[186788]: 2025-11-22 08:03:52.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:03:53 np0005531888 nova_compute[186788]: 2025-11-22 08:03:53.037 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:53 np0005531888 nova_compute[186788]: 2025-11-22 08:03:53.966 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:53 np0005531888 nova_compute[186788]: 2025-11-22 08:03:53.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:03:53 np0005531888 nova_compute[186788]: 2025-11-22 08:03:53.992 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.005 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.006 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.042 186792 DEBUG nova.network.neutron [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Added VIF to instance network info cache for port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3489#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.043 186792 DEBUG nova.network.neutron [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.439 186792 DEBUG oslo_concurrency.lockutils [req-dbf0cab1-2162-431a-b6ec-25a322a0bb18 req-1334fb5b-1948-425d-991e-35b13826691d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.439 186792 DEBUG oslo_concurrency.lockutils [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.440 186792 DEBUG nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.880 186792 WARNING nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.881 186792 WARNING nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.881 186792 WARNING nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc already exists in list: port_ids containing: ['650f9e14-a6b8-46d0-8167-1eb22fcbc8fc']. ignoring it#033[00m
Nov 22 03:03:54 np0005531888 nova_compute[186788]: 2025-11-22 08:03:54.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.273 186792 DEBUG nova.network.neutron [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.498 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.499 186792 DEBUG nova.objects.instance [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.532 186792 DEBUG nova.virt.libvirt.vif [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.532 186792 DEBUG nova.network.os_vif_util [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.533 186792 DEBUG nova.network.os_vif_util [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.534 186792 DEBUG os_vif [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.536 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.537 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2f45e58-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.537 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.539 186792 INFO os_vif [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.539 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.539 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.745 186792 DEBUG nova.compute.provider_tree [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.768 186792 DEBUG nova.scheduler.client.report [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:03:55 np0005531888 nova_compute[186788]: 2025-11-22 08:03:55.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:56 np0005531888 nova_compute[186788]: 2025-11-22 08:03:56.184 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:56 np0005531888 nova_compute[186788]: 2025-11-22 08:03:56.437 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798621.435761, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:56 np0005531888 nova_compute[186788]: 2025-11-22 08:03:56.437 186792 INFO nova.compute.manager [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:03:56 np0005531888 nova_compute[186788]: 2025-11-22 08:03:56.474 186792 DEBUG nova.compute.manager [None req-54bd39a5-928b-4714-827f-f76962c5c235 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:56 np0005531888 nova_compute[186788]: 2025-11-22 08:03:56.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:57 np0005531888 nova_compute[186788]: 2025-11-22 08:03:57.332 186792 INFO nova.scheduler.client.report [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Deleted allocation for migration 90f3f02c-becf-4e76-be2e-e639916871d2#033[00m
Nov 22 03:03:57 np0005531888 nova_compute[186788]: 2025-11-22 08:03:57.475 186792 DEBUG oslo_concurrency.lockutils [None req-1f14f6b7-e570-4c6e-8b51-3363946aefee b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 7.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:58 np0005531888 nova_compute[186788]: 2025-11-22 08:03:58.040 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:58 np0005531888 nova_compute[186788]: 2025-11-22 08:03:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:58 np0005531888 nova_compute[186788]: 2025-11-22 08:03:58.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.144 186792 DEBUG nova.network.neutron [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.215 186792 DEBUG oslo_concurrency.lockutils [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.221 186792 DEBUG nova.virt.libvirt.vif [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.222 186792 DEBUG nova.network.os_vif_util [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.222 186792 DEBUG nova.network.os_vif_util [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.223 186792 DEBUG os_vif [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.223 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.224 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.224 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.227 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.228 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650f9e14-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.228 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap650f9e14-a6, col_values=(('external_ids', {'iface-id': '650f9e14-a6b8-46d0-8167-1eb22fcbc8fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:e8:28', 'vm-uuid': 'ad45d92a-70c4-461c-80d8-2c75f978d5e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.231 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 NetworkManager[55166]: <info>  [1763798640.2326] manager: (tap650f9e14-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.234 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.241 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.242 186792 INFO os_vif [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6')#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.243 186792 DEBUG nova.virt.libvirt.vif [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.243 186792 DEBUG nova.network.os_vif_util [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.244 186792 DEBUG nova.network.os_vif_util [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.247 186792 DEBUG nova.virt.libvirt.guest [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:35:e8:28"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <target dev="tap650f9e14-a6"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]: </interface>
Nov 22 03:04:00 np0005531888 nova_compute[186788]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:04:00 np0005531888 kernel: tap650f9e14-a6: entered promiscuous mode
Nov 22 03:04:00 np0005531888 NetworkManager[55166]: <info>  [1763798640.2607] manager: (tap650f9e14-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Nov 22 03:04:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:00Z|00301|binding|INFO|Claiming lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for this chassis.
Nov 22 03:04:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:00Z|00302|binding|INFO|650f9e14-a6b8-46d0-8167-1eb22fcbc8fc: Claiming fa:16:3e:35:e8:28 10.100.0.6
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:00Z|00303|binding|INFO|Setting lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc ovn-installed in OVS
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.278 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 systemd-udevd[229251]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:04:00 np0005531888 NetworkManager[55166]: <info>  [1763798640.3256] device (tap650f9e14-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:04:00 np0005531888 NetworkManager[55166]: <info>  [1763798640.3265] device (tap650f9e14-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:04:00 np0005531888 podman[229235]: 2025-11-22 08:04:00.362740556 +0000 UTC m=+0.069443208 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:04:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:00Z|00304|binding|INFO|Setting lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc up in Southbound
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.365 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e8:28 10.100.0.6'], port_security=['fa:16:3e:35:e8:28 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ad45d92a-70c4-461c-80d8-2c75f978d5e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.366 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.368 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:04:00 np0005531888 podman[229234]: 2025-11-22 08:04:00.379451647 +0000 UTC m=+0.087395430 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.390 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e2736961-6d72-4a69-b15b-fef713024f83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.420 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0c05f5-998d-4a82-a03d-0ad304b2eb5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.425 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2c329e17-1e3e-4ad6-b2c8-ff1fcfe65bf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.449 186792 DEBUG nova.virt.libvirt.driver [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.450 186792 DEBUG nova.virt.libvirt.driver [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.450 186792 DEBUG nova.virt.libvirt.driver [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:65:62:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.450 186792 DEBUG nova.virt.libvirt.driver [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:35:e8:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.456 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3b304b38-2d86-4a63-b657-a95c4ea1e624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.476 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eef6f705-c90a-4eb2-9792-997018de7cd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534426, 'reachable_time': 38194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229283, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.492 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd7ee74-0d75-4b01-b9a4-e79a090796fd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534435, 'tstamp': 534435}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229284, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534437, 'tstamp': 534437}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229284, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.494 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.496 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.498 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.498 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.499 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:00.499 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.533 186792 DEBUG nova.virt.libvirt.guest [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:name>tempest-tempest.common.compute-instance-1110835689</nova:name>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:04:00</nova:creationTime>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:port uuid="d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938">
Nov 22 03:04:00 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    <nova:port uuid="650f9e14-a6b8-46d0-8167-1eb22fcbc8fc">
Nov 22 03:04:00 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:00 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:04:00 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:04:00 np0005531888 nova_compute[186788]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.566 186792 DEBUG oslo_concurrency.lockutils [None req-d1cba9ab-259c-4bbc-bcbc-036c95d54b6c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-ad45d92a-70c4-461c-80d8-2c75f978d5e6-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 14.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.967 186792 DEBUG nova.compute.manager [req-4ad9b441-6e56-4803-8b38-7ee872408892 req-d7ffbcdb-58ba-43da-946a-ea93c1c47f4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.967 186792 DEBUG oslo_concurrency.lockutils [req-4ad9b441-6e56-4803-8b38-7ee872408892 req-d7ffbcdb-58ba-43da-946a-ea93c1c47f4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.967 186792 DEBUG oslo_concurrency.lockutils [req-4ad9b441-6e56-4803-8b38-7ee872408892 req-d7ffbcdb-58ba-43da-946a-ea93c1c47f4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.968 186792 DEBUG oslo_concurrency.lockutils [req-4ad9b441-6e56-4803-8b38-7ee872408892 req-d7ffbcdb-58ba-43da-946a-ea93c1c47f4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.968 186792 DEBUG nova.compute.manager [req-4ad9b441-6e56-4803-8b38-7ee872408892 req-d7ffbcdb-58ba-43da-946a-ea93c1c47f4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:00 np0005531888 nova_compute[186788]: 2025-11-22 08:04:00.968 186792 WARNING nova.compute.manager [req-4ad9b441-6e56-4803-8b38-7ee872408892 req-d7ffbcdb-58ba-43da-946a-ea93c1c47f4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received unexpected event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:04:01 np0005531888 nova_compute[186788]: 2025-11-22 08:04:01.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.408 186792 DEBUG oslo_concurrency.lockutils [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-ad45d92a-70c4-461c-80d8-2c75f978d5e6-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.409 186792 DEBUG oslo_concurrency.lockutils [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-ad45d92a-70c4-461c-80d8-2c75f978d5e6-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.420 186792 DEBUG nova.objects.instance [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid ad45d92a-70c4-461c-80d8-2c75f978d5e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.460 186792 DEBUG nova.virt.libvirt.vif [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.460 186792 DEBUG nova.network.os_vif_util [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.461 186792 DEBUG nova.network.os_vif_util [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.464 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.467 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.469 186792 DEBUG nova.virt.libvirt.driver [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Attempting to detach device tap650f9e14-a6 from instance ad45d92a-70c4-461c-80d8-2c75f978d5e6 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.469 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:35:e8:28"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <target dev="tap650f9e14-a6"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </interface>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.600 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.605 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface>not found in domain: <domain type='kvm' id='47'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <name>instance-00000066</name>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <uuid>ad45d92a-70c4-461c-80d8-2c75f978d5e6</uuid>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:name>tempest-tempest.common.compute-instance-1110835689</nova:name>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:04:00</nova:creationTime>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:port uuid="d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:port uuid="650f9e14-a6b8-46d0-8167-1eb22fcbc8fc">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <memory unit='KiB'>131072</memory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <resource>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <partition>/machine</partition>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </resource>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <sysinfo type='smbios'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='serial'>ad45d92a-70c4-461c-80d8-2c75f978d5e6</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='uuid'>ad45d92a-70c4-461c-80d8-2c75f978d5e6</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <boot dev='hd'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <smbios mode='sysinfo'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <vmcoreinfo state='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <feature policy='require' name='x2apic'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <feature policy='require' name='vme'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <clock offset='utc'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <timer name='hpet' present='no'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <on_reboot>restart</on_reboot>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <on_crash>destroy</on_crash>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <disk type='file' device='disk'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk' index='2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <backingStore type='file' index='3'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <format type='raw'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <backingStore/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      </backingStore>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='vda' bus='virtio'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='virtio-disk0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <disk type='file' device='cdrom'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.config' index='1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <backingStore/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='sda' bus='sata'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <readonly/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='sata0-0-0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pcie.0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='1' port='0x10'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='2' port='0x11'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='3' port='0x12'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='4' port='0x13'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='5' port='0x14'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='6' port='0x15'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='7' port='0x16'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='8' port='0x17'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.8'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='9' port='0x18'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.9'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='10' port='0x19'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.10'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='11' port='0x1a'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.11'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='12' port='0x1b'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.12'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='13' port='0x1c'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.13'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='14' port='0x1d'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.14'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='15' port='0x1e'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.15'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='16' port='0x1f'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.16'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='17' port='0x20'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.17'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='18' port='0x21'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.18'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='19' port='0x22'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.19'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='20' port='0x23'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.20'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='21' port='0x24'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.21'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='22' port='0x25'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.22'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='23' port='0x26'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.23'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='24' port='0x27'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.24'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='25' port='0x28'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.25'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-pci-bridge'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.26'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='usb'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='sata' index='0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='ide'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:65:62:3d'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='tapd7a9f14e-7a'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='net0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:35:e8:28'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='tap650f9e14-a6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='net1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <serial type='pty'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/console.log' append='off'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target type='isa-serial' port='0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <model name='isa-serial'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      </target>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/console.log' append='off'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target type='serial' port='0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </console>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <input type='tablet' bus='usb'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='input0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <input type='mouse' bus='ps2'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='input1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <input type='keyboard' bus='ps2'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='input2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <listen type='address' address='::0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <audio id='1' type='none'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='video0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <watchdog model='itco' action='reset'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='watchdog0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </watchdog>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <memballoon model='virtio'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <stats period='10'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='balloon0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <rng model='virtio'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='rng0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <label>system_u:system_r:svirt_t:s0:c116,c803</label>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c116,c803</imagelabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <label>+107:+107</label>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.606 186792 INFO nova.virt.libvirt.driver [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tap650f9e14-a6 from instance ad45d92a-70c4-461c-80d8-2c75f978d5e6 from the persistent domain config.#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.607 186792 DEBUG nova.virt.libvirt.driver [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] (1/8): Attempting to detach device tap650f9e14-a6 with device alias net1 from instance ad45d92a-70c4-461c-80d8-2c75f978d5e6 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.607 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:35:e8:28"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <target dev="tap650f9e14-a6"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </interface>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 22 03:04:02 np0005531888 kernel: tap650f9e14-a6 (unregistering): left promiscuous mode
Nov 22 03:04:02 np0005531888 NetworkManager[55166]: <info>  [1763798642.7192] device (tap650f9e14-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:04:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:02Z|00305|binding|INFO|Releasing lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc from this chassis (sb_readonly=0)
Nov 22 03:04:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:02Z|00306|binding|INFO|Setting lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc down in Southbound
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:02Z|00307|binding|INFO|Removing iface tap650f9e14-a6 ovn-installed in OVS
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.728 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.741 186792 DEBUG nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Received event <DeviceRemovedEvent: 1763798642.741342, ad45d92a-70c4-461c-80d8-2c75f978d5e6 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.743 186792 DEBUG nova.virt.libvirt.driver [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Start waiting for the detach event from libvirt for device tap650f9e14-a6 with device alias net1 for instance ad45d92a-70c4-461c-80d8-2c75f978d5e6 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.743 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.746 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.747 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> not found in domain: <domain type='kvm' id='47'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <name>instance-00000066</name>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <uuid>ad45d92a-70c4-461c-80d8-2c75f978d5e6</uuid>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:name>tempest-tempest.common.compute-instance-1110835689</nova:name>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:04:00</nova:creationTime>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:port uuid="d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:port uuid="650f9e14-a6b8-46d0-8167-1eb22fcbc8fc">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <memory unit='KiB'>131072</memory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <resource>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <partition>/machine</partition>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </resource>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <sysinfo type='smbios'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='serial'>ad45d92a-70c4-461c-80d8-2c75f978d5e6</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='uuid'>ad45d92a-70c4-461c-80d8-2c75f978d5e6</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <boot dev='hd'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <smbios mode='sysinfo'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <vmcoreinfo state='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <feature policy='require' name='x2apic'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <feature policy='require' name='vme'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <clock offset='utc'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <timer name='hpet' present='no'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <on_reboot>restart</on_reboot>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <on_crash>destroy</on_crash>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <disk type='file' device='disk'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk' index='2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <backingStore type='file' index='3'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <format type='raw'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <backingStore/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      </backingStore>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='vda' bus='virtio'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='virtio-disk0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <disk type='file' device='cdrom'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk.config' index='1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <backingStore/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='sda' bus='sata'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <readonly/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='sata0-0-0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pcie.0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='1' port='0x10'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='2' port='0x11'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='3' port='0x12'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='4' port='0x13'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='5' port='0x14'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='6' port='0x15'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='7' port='0x16'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='8' port='0x17'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.8'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='9' port='0x18'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.9'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='10' port='0x19'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.10'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='11' port='0x1a'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.11'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='12' port='0x1b'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.12'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='13' port='0x1c'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.13'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='14' port='0x1d'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.14'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='15' port='0x1e'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.15'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='16' port='0x1f'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.16'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='17' port='0x20'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.17'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='18' port='0x21'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.18'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='19' port='0x22'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.19'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='20' port='0x23'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.20'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='21' port='0x24'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.21'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='22' port='0x25'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.22'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='23' port='0x26'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.23'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='24' port='0x27'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.24'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target chassis='25' port='0x28'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.25'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model name='pcie-pci-bridge'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='pci.26'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='usb'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <controller type='sata' index='0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='ide'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:65:62:3d'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target dev='tapd7a9f14e-7a'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='net0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <serial type='pty'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/console.log' append='off'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target type='isa-serial' port='0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:        <model name='isa-serial'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      </target>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/console.log' append='off'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <target type='serial' port='0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </console>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <input type='tablet' bus='usb'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='input0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <input type='mouse' bus='ps2'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='input1'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <input type='keyboard' bus='ps2'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='input2'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <listen type='address' address='::0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <audio id='1' type='none'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='video0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <watchdog model='itco' action='reset'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='watchdog0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </watchdog>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <memballoon model='virtio'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <stats period='10'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='balloon0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <rng model='virtio'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <alias name='rng0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <label>system_u:system_r:svirt_t:s0:c116,c803</label>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c116,c803</imagelabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <label>+107:+107</label>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.747 186792 INFO nova.virt.libvirt.driver [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tap650f9e14-a6 from instance ad45d92a-70c4-461c-80d8-2c75f978d5e6 from the live domain config.#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.748 186792 DEBUG nova.virt.libvirt.vif [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.748 186792 DEBUG nova.network.os_vif_util [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.749 186792 DEBUG nova.network.os_vif_util [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.749 186792 DEBUG os_vif [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.751 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650f9e14-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.754 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.756 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e8:28 10.100.0.6'], port_security=['fa:16:3e:35:e8:28 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ad45d92a-70c4-461c-80d8-2c75f978d5e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.757 186792 INFO os_vif [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6')#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.757 186792 DEBUG nova.virt.libvirt.guest [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:name>tempest-tempest.common.compute-instance-1110835689</nova:name>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:04:02</nova:creationTime>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    <nova:port uuid="d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938">
Nov 22 03:04:02 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:04:02 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:04:02 np0005531888 nova_compute[186788]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.758 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.760 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.774 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[abd87ad1-ed96-42ad-bf73-39444d229fbf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.801 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[865ad798-9b0b-449d-9f91-39c9c5fe5b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.804 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[da3550b8-f271-4dd5-acec-bbeedf077df3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.834 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[473dc569-7674-4ea9-9f20-0e02c8c5e00c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.852 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9636a1a1-0439-42be-9850-b1fcf6478148]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534426, 'reachable_time': 38194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229295, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.868 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fa429843-5867-4470-b6fe-36ac96583466]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534435, 'tstamp': 534435}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229296, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534437, 'tstamp': 534437}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229296, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.870 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.875 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.875 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.876 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.876 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:02.877 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:02 np0005531888 nova_compute[186788]: 2025-11-22 08:04:02.989 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.043 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.074 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.100 186792 DEBUG nova.compute.manager [req-28d828f5-6b2b-4a98-91e2-0e8adde84ef7 req-3fe9ed19-5476-49dc-b9ce-647bea506de5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.101 186792 DEBUG oslo_concurrency.lockutils [req-28d828f5-6b2b-4a98-91e2-0e8adde84ef7 req-3fe9ed19-5476-49dc-b9ce-647bea506de5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.101 186792 DEBUG oslo_concurrency.lockutils [req-28d828f5-6b2b-4a98-91e2-0e8adde84ef7 req-3fe9ed19-5476-49dc-b9ce-647bea506de5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.101 186792 DEBUG oslo_concurrency.lockutils [req-28d828f5-6b2b-4a98-91e2-0e8adde84ef7 req-3fe9ed19-5476-49dc-b9ce-647bea506de5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.102 186792 DEBUG nova.compute.manager [req-28d828f5-6b2b-4a98-91e2-0e8adde84ef7 req-3fe9ed19-5476-49dc-b9ce-647bea506de5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.102 186792 WARNING nova.compute.manager [req-28d828f5-6b2b-4a98-91e2-0e8adde84ef7 req-3fe9ed19-5476-49dc-b9ce-647bea506de5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received unexpected event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.137 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.138 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.197 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.394 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.396 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5514MB free_disk=73.24660110473633GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.396 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.396 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.681 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance ad45d92a-70c4-461c-80d8-2c75f978d5e6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.681 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.681 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.876 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.890 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.982 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:03 np0005531888 nova_compute[186788]: 2025-11-22 08:04:03.983 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:04.123 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:04 np0005531888 nova_compute[186788]: 2025-11-22 08:04:04.124 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:04.126 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:04:04 np0005531888 nova_compute[186788]: 2025-11-22 08:04:04.886 186792 DEBUG oslo_concurrency.lockutils [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:04 np0005531888 nova_compute[186788]: 2025-11-22 08:04:04.887 186792 DEBUG oslo_concurrency.lockutils [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:04 np0005531888 nova_compute[186788]: 2025-11-22 08:04:04.887 186792 DEBUG nova.network.neutron [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.408 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.408 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.409 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.409 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.409 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.418 186792 INFO nova.compute.manager [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Terminating instance#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.426 186792 DEBUG nova.compute.manager [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:04:05 np0005531888 kernel: tapd7a9f14e-7a (unregistering): left promiscuous mode
Nov 22 03:04:05 np0005531888 NetworkManager[55166]: <info>  [1763798645.4626] device (tapd7a9f14e-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.470 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:05Z|00308|binding|INFO|Releasing lport d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 from this chassis (sb_readonly=0)
Nov 22 03:04:05 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:05Z|00309|binding|INFO|Setting lport d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 down in Southbound
Nov 22 03:04:05 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:05Z|00310|binding|INFO|Removing iface tapd7a9f14e-7a ovn-installed in OVS
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.472 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.488 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000066.scope: Deactivated successfully.
Nov 22 03:04:05 np0005531888 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000066.scope: Consumed 16.431s CPU time.
Nov 22 03:04:05 np0005531888 systemd-machined[153106]: Machine qemu-47-instance-00000066 terminated.
Nov 22 03:04:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:05.514 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:62:3d 10.100.0.11'], port_security=['fa:16:3e:65:62:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ad45d92a-70c4-461c-80d8-2c75f978d5e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6da332c7-e52a-4f92-8c24-c2ee0c6e77d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:05.515 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:04:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:05.517 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a4a282c-db22-41de-b34b-2960aa032ca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:04:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:05.518 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1b0459-889b-47ee-940a-4fd260f48e4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:05.518 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace which is not needed anymore#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.593 186792 DEBUG nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-unplugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.594 186792 DEBUG oslo_concurrency.lockutils [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.594 186792 DEBUG oslo_concurrency.lockutils [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.595 186792 DEBUG oslo_concurrency.lockutils [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.595 186792 DEBUG nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-unplugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.596 186792 DEBUG nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-unplugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.596 186792 DEBUG nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.596 186792 DEBUG oslo_concurrency.lockutils [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.597 186792 DEBUG oslo_concurrency.lockutils [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.597 186792 DEBUG oslo_concurrency.lockutils [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.597 186792 DEBUG nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.598 186792 WARNING nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received unexpected event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.707 186792 INFO nova.virt.libvirt.driver [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Instance destroyed successfully.#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.708 186792 DEBUG nova.objects.instance [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'resources' on Instance uuid ad45d92a-70c4-461c-80d8-2c75f978d5e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.756 186792 DEBUG nova.virt.libvirt.vif [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.757 186792 DEBUG nova.network.os_vif_util [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.757 186792 DEBUG nova.network.os_vif_util [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.758 186792 DEBUG os_vif [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.759 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7a9f14e-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.761 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.763 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.766 186792 INFO os_vif [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:62:3d,bridge_name='br-int',has_traffic_filtering=True,id=d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7a9f14e-7a')#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.767 186792 DEBUG nova.virt.libvirt.vif [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:03:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1110835689',display_name='tempest-tempest.common.compute-instance-1110835689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1110835689',id=102,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-0a57k9ms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=ad45d92a-70c4-461c-80d8-2c75f978d5e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.767 186792 DEBUG nova.network.os_vif_util [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.767 186792 DEBUG nova.network.os_vif_util [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.768 186792 DEBUG os_vif [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.769 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.769 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650f9e14-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.769 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.771 186792 INFO os_vif [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6')#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.772 186792 INFO nova.virt.libvirt.driver [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Deleting instance files /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6_del#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.772 186792 INFO nova.virt.libvirt.driver [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Deletion of /var/lib/nova/instances/ad45d92a-70c4-461c-80d8-2c75f978d5e6_del complete#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.867 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.891 186792 INFO nova.compute.manager [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.892 186792 DEBUG oslo.service.loopingcall [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.892 186792 DEBUG nova.compute.manager [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:04:05 np0005531888 nova_compute[186788]: 2025-11-22 08:04:05.893 186792 DEBUG nova.network.neutron [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:04:05 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [NOTICE]   (228951) : haproxy version is 2.8.14-c23fe91
Nov 22 03:04:05 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [NOTICE]   (228951) : path to executable is /usr/sbin/haproxy
Nov 22 03:04:05 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [WARNING]  (228951) : Exiting Master process...
Nov 22 03:04:05 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [ALERT]    (228951) : Current worker (228954) exited with code 143 (Terminated)
Nov 22 03:04:05 np0005531888 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228947]: [WARNING]  (228951) : All workers exited. Exiting... (0)
Nov 22 03:04:05 np0005531888 systemd[1]: libpod-cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e.scope: Deactivated successfully.
Nov 22 03:04:05 np0005531888 podman[229328]: 2025-11-22 08:04:05.917503439 +0000 UTC m=+0.302227563 container died cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:04:06 np0005531888 nova_compute[186788]: 2025-11-22 08:04:06.104 186792 DEBUG nova.compute.manager [req-d32874f2-208c-436d-86cf-15904e1664dc req-b2a6cb77-c2b4-4b77-8796-4ea6bc0575df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-unplugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:06 np0005531888 nova_compute[186788]: 2025-11-22 08:04:06.104 186792 DEBUG oslo_concurrency.lockutils [req-d32874f2-208c-436d-86cf-15904e1664dc req-b2a6cb77-c2b4-4b77-8796-4ea6bc0575df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:06 np0005531888 nova_compute[186788]: 2025-11-22 08:04:06.105 186792 DEBUG oslo_concurrency.lockutils [req-d32874f2-208c-436d-86cf-15904e1664dc req-b2a6cb77-c2b4-4b77-8796-4ea6bc0575df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:06 np0005531888 nova_compute[186788]: 2025-11-22 08:04:06.105 186792 DEBUG oslo_concurrency.lockutils [req-d32874f2-208c-436d-86cf-15904e1664dc req-b2a6cb77-c2b4-4b77-8796-4ea6bc0575df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:06 np0005531888 nova_compute[186788]: 2025-11-22 08:04:06.105 186792 DEBUG nova.compute.manager [req-d32874f2-208c-436d-86cf-15904e1664dc req-b2a6cb77-c2b4-4b77-8796-4ea6bc0575df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-unplugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:06 np0005531888 nova_compute[186788]: 2025-11-22 08:04:06.106 186792 DEBUG nova.compute.manager [req-d32874f2-208c-436d-86cf-15904e1664dc req-b2a6cb77-c2b4-4b77-8796-4ea6bc0575df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-unplugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:04:06 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e-userdata-shm.mount: Deactivated successfully.
Nov 22 03:04:06 np0005531888 systemd[1]: var-lib-containers-storage-overlay-f408171d412da31c590fc5ffbfbb3785214ce2f261c2995fba8bdb0ede87ce41-merged.mount: Deactivated successfully.
Nov 22 03:04:07 np0005531888 podman[229328]: 2025-11-22 08:04:07.028786294 +0000 UTC m=+1.413510418 container cleanup cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:04:07 np0005531888 systemd[1]: libpod-conmon-cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e.scope: Deactivated successfully.
Nov 22 03:04:07 np0005531888 podman[229374]: 2025-11-22 08:04:07.764392131 +0000 UTC m=+0.712007848 container remove cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.771 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f44309-e8fc-4b17-869c-cd06c4f563d3]: (4, ('Sat Nov 22 08:04:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e)\ncf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e\nSat Nov 22 08:04:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (cf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e)\ncf9d3122e6a50c674cb17625cada50bcdc3a20b0af48958863b78bbe8326f48e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.774 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[88bc39f5-ad48-49d8-99cd-78d471c7f573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.776 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:07 np0005531888 nova_compute[186788]: 2025-11-22 08:04:07.779 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:07 np0005531888 kernel: tap6a4a282c-d0: left promiscuous mode
Nov 22 03:04:07 np0005531888 nova_compute[186788]: 2025-11-22 08:04:07.795 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.799 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb86e3f-8a1b-493c-9824-656c903d686b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.813 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1c867b-1f71-407e-8dbc-ce49d95bcafb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.815 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[da472087-bfb5-4a74-a132-d6a03099a731]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.833 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3ef74a-02c0-4697-8c97-c040c63877ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534418, 'reachable_time': 42507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229390, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.836 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:04:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:07.836 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ec1052-fe19-4097-8fc3-52caf61bd4af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:07 np0005531888 systemd[1]: run-netns-ovnmeta\x2d6a4a282c\x2ddb22\x2d41de\x2db34b\x2d2960aa032ca8.mount: Deactivated successfully.
Nov 22 03:04:08 np0005531888 nova_compute[186788]: 2025-11-22 08:04:08.044 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:08.127 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.050 186792 DEBUG nova.compute.manager [req-bfae6980-4ccd-45cf-b626-4000380a4439 req-ae9efda0-6b08-413f-b785-4c9d5aa77c43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.051 186792 DEBUG oslo_concurrency.lockutils [req-bfae6980-4ccd-45cf-b626-4000380a4439 req-ae9efda0-6b08-413f-b785-4c9d5aa77c43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.051 186792 DEBUG oslo_concurrency.lockutils [req-bfae6980-4ccd-45cf-b626-4000380a4439 req-ae9efda0-6b08-413f-b785-4c9d5aa77c43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.051 186792 DEBUG oslo_concurrency.lockutils [req-bfae6980-4ccd-45cf-b626-4000380a4439 req-ae9efda0-6b08-413f-b785-4c9d5aa77c43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.051 186792 DEBUG nova.compute.manager [req-bfae6980-4ccd-45cf-b626-4000380a4439 req-ae9efda0-6b08-413f-b785-4c9d5aa77c43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] No waiting events found dispatching network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.052 186792 WARNING nova.compute.manager [req-bfae6980-4ccd-45cf-b626-4000380a4439 req-ae9efda0-6b08-413f-b785-4c9d5aa77c43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received unexpected event network-vif-plugged-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.347 186792 INFO nova.network.neutron [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.348 186792 DEBUG nova.network.neutron [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [{"id": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "address": "fa:16:3e:65:62:3d", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7a9f14e-7a", "ovs_interfaceid": "d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.417 186792 DEBUG oslo_concurrency.lockutils [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-ad45d92a-70c4-461c-80d8-2c75f978d5e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:04:09 np0005531888 nova_compute[186788]: 2025-11-22 08:04:09.462 186792 DEBUG oslo_concurrency.lockutils [None req-56553f01-23bb-4856-b447-e86803b6a919 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-ad45d92a-70c4-461c-80d8-2c75f978d5e6-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 7.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:09 np0005531888 podman[229391]: 2025-11-22 08:04:09.687673531 +0000 UTC m=+0.056920221 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:04:10 np0005531888 nova_compute[186788]: 2025-11-22 08:04:10.000 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:04:10 np0005531888 nova_compute[186788]: 2025-11-22 08:04:10.762 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:11 np0005531888 podman[229411]: 2025-11-22 08:04:11.679446115 +0000 UTC m=+0.052440470 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:04:12 np0005531888 nova_compute[186788]: 2025-11-22 08:04:12.888 186792 DEBUG nova.network.neutron [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:04:12 np0005531888 nova_compute[186788]: 2025-11-22 08:04:12.891 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:04:12 np0005531888 nova_compute[186788]: 2025-11-22 08:04:12.951 186792 INFO nova.compute.manager [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Took 7.06 seconds to deallocate network for instance.
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.081 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.082 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.119 186792 DEBUG nova.compute.manager [req-9f6927d3-2fa5-4004-8867-d02f3c02bba6 req-6bfde669-c924-4f41-b545-21daf8052591 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Received event network-vif-deleted-d7a9f14e-7a03-4a17-aff2-c6c5bf3e9938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.151 186792 DEBUG nova.compute.provider_tree [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.176 186792 DEBUG nova.scheduler.client.report [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.210 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.263 186792 INFO nova.scheduler.client.report [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Deleted allocations for instance ad45d92a-70c4-461c-80d8-2c75f978d5e6
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.376 186792 DEBUG oslo_concurrency.lockutils [None req-a11df09d-2b46-4857-b1bb-b2365b23267c 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "ad45d92a-70c4-461c-80d8-2c75f978d5e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.886 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.887 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:13 np0005531888 nova_compute[186788]: 2025-11-22 08:04:13.936 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.100 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.144 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.145 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.152 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.153 186792 INFO nova.compute.claims [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Claim successful on node compute-2.ctlplane.example.com
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.394 186792 DEBUG nova.compute.provider_tree [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.423 186792 DEBUG nova.scheduler.client.report [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.448 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.448 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.520 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.521 186792 DEBUG nova.network.neutron [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.542 186792 INFO nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.564 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.731 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.732 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.733 186792 INFO nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Creating image(s)
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.733 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.733 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.734 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.746 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.805 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.806 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.806 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.818 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.870 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.871 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:04:14 np0005531888 nova_compute[186788]: 2025-11-22 08:04:14.903 186792 DEBUG nova.policy [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.407 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk 1073741824" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.408 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.408 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.471 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.472 186792 DEBUG nova.virt.disk.api [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.472 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.535 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.536 186792 DEBUG nova.virt.disk.api [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.537 186792 DEBUG nova.objects.instance [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.559 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.560 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Ensure instance console log exists: /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.561 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.561 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.561 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:15 np0005531888 nova_compute[186788]: 2025-11-22 08:04:15.766 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:16 np0005531888 podman[229450]: 2025-11-22 08:04:16.683112958 +0000 UTC m=+0.060933299 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, 
vcs-type=git, version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:04:18 np0005531888 nova_compute[186788]: 2025-11-22 08:04:18.050 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:19 np0005531888 nova_compute[186788]: 2025-11-22 08:04:19.677 186792 DEBUG nova.network.neutron [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Successfully created port: 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:04:20 np0005531888 nova_compute[186788]: 2025-11-22 08:04:20.705 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798645.7037418, ad45d92a-70c4-461c-80d8-2c75f978d5e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:04:20 np0005531888 nova_compute[186788]: 2025-11-22 08:04:20.705 186792 INFO nova.compute.manager [-] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:04:20 np0005531888 nova_compute[186788]: 2025-11-22 08:04:20.723 186792 DEBUG nova.compute.manager [None req-d0bccb10-9441-4ec9-a8ed-281e11bf6dee - - - - - -] [instance: ad45d92a-70c4-461c-80d8-2c75f978d5e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:04:20 np0005531888 nova_compute[186788]: 2025-11-22 08:04:20.768 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:21 np0005531888 podman[229471]: 2025-11-22 08:04:21.6883829 +0000 UTC m=+0.059828402 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 03:04:21 np0005531888 podman[229472]: 2025-11-22 08:04:21.732961306 +0000 UTC m=+0.096936695 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:04:21 np0005531888 nova_compute[186788]: 2025-11-22 08:04:21.902 186792 DEBUG nova.network.neutron [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Successfully updated port: 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:04:21 np0005531888 nova_compute[186788]: 2025-11-22 08:04:21.945 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:21 np0005531888 nova_compute[186788]: 2025-11-22 08:04:21.946 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:21 np0005531888 nova_compute[186788]: 2025-11-22 08:04:21.946 186792 DEBUG nova.network.neutron [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:04:22 np0005531888 nova_compute[186788]: 2025-11-22 08:04:22.114 186792 DEBUG nova.compute.manager [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:22 np0005531888 nova_compute[186788]: 2025-11-22 08:04:22.114 186792 DEBUG nova.compute.manager [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing instance network info cache due to event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:04:22 np0005531888 nova_compute[186788]: 2025-11-22 08:04:22.115 186792 DEBUG oslo_concurrency.lockutils [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:22 np0005531888 nova_compute[186788]: 2025-11-22 08:04:22.188 186792 DEBUG nova.network.neutron [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:04:23 np0005531888 nova_compute[186788]: 2025-11-22 08:04:23.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.183 186792 DEBUG nova.network.neutron [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.212 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.212 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance network_info: |[{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.215 186792 DEBUG oslo_concurrency.lockutils [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.215 186792 DEBUG nova.network.neutron [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.219 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start _get_guest_xml network_info=[{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.226 186792 WARNING nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.232 186792 DEBUG nova.virt.libvirt.host [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.233 186792 DEBUG nova.virt.libvirt.host [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.239 186792 DEBUG nova.virt.libvirt.host [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.241 186792 DEBUG nova.virt.libvirt.host [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.242 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.243 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.243 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.244 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.244 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.244 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.245 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.245 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.245 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.246 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.246 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.246 186792 DEBUG nova.virt.hardware [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.252 186792 DEBUG nova.virt.libvirt.vif [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:04:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.253 186792 DEBUG nova.network.os_vif_util [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.254 186792 DEBUG nova.network.os_vif_util [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.255 186792 DEBUG nova.objects.instance [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.300 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <uuid>b9ee5ebd-90a8-426a-b369-d38bf61616f2</uuid>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <name>instance-00000067</name>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestJSON-server-316749730</nova:name>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:04:24</nova:creationTime>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        <nova:port uuid="348c8bec-11f0-4b6d-9dce-ae3c3f37efbc">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <entry name="serial">b9ee5ebd-90a8-426a-b369-d38bf61616f2</entry>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <entry name="uuid">b9ee5ebd-90a8-426a-b369-d38bf61616f2</entry>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:ba:d5:b9"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <target dev="tap348c8bec-11"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/console.log" append="off"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:04:24 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:04:24 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:04:24 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:04:24 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.301 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Preparing to wait for external event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.302 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.302 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.302 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.303 186792 DEBUG nova.virt.libvirt.vif [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:04:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.303 186792 DEBUG nova.network.os_vif_util [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.304 186792 DEBUG nova.network.os_vif_util [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.304 186792 DEBUG os_vif [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.304 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.305 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.305 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.307 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap348c8bec-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.308 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap348c8bec-11, col_values=(('external_ids', {'iface-id': '348c8bec-11f0-4b6d-9dce-ae3c3f37efbc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:d5:b9', 'vm-uuid': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:24 np0005531888 NetworkManager[55166]: <info>  [1763798664.3101] manager: (tap348c8bec-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.316 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.317 186792 INFO os_vif [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11')#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.549 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.549 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.551 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:ba:d5:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:04:24 np0005531888 nova_compute[186788]: 2025-11-22 08:04:24.551 186792 INFO nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Using config drive#033[00m
Nov 22 03:04:25 np0005531888 nova_compute[186788]: 2025-11-22 08:04:25.572 186792 INFO nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Creating config drive at /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config#033[00m
Nov 22 03:04:25 np0005531888 nova_compute[186788]: 2025-11-22 08:04:25.579 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxei_vuii execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:25 np0005531888 nova_compute[186788]: 2025-11-22 08:04:25.705 186792 DEBUG oslo_concurrency.processutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxei_vuii" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:25 np0005531888 kernel: tap348c8bec-11: entered promiscuous mode
Nov 22 03:04:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:25Z|00311|binding|INFO|Claiming lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for this chassis.
Nov 22 03:04:25 np0005531888 NetworkManager[55166]: <info>  [1763798665.7743] manager: (tap348c8bec-11): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Nov 22 03:04:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:25Z|00312|binding|INFO|348c8bec-11f0-4b6d-9dce-ae3c3f37efbc: Claiming fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 03:04:25 np0005531888 nova_compute[186788]: 2025-11-22 08:04:25.774 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:25Z|00313|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc ovn-installed in OVS
Nov 22 03:04:25 np0005531888 nova_compute[186788]: 2025-11-22 08:04:25.788 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:25 np0005531888 nova_compute[186788]: 2025-11-22 08:04:25.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.799 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:25Z|00314|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc up in Southbound
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.800 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.802 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518#033[00m
Nov 22 03:04:25 np0005531888 systemd-udevd[229541]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.814 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0d07ee4b-7efe-47d9-a986-e5e9b7cb8aca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.815 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.817 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.817 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2be428db-0584-44cf-aba2-470501dd50bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 systemd-machined[153106]: New machine qemu-48-instance-00000067.
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.818 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c1090841-c99a-4269-95de-e1e30354e20e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 NetworkManager[55166]: <info>  [1763798665.8265] device (tap348c8bec-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:04:25 np0005531888 NetworkManager[55166]: <info>  [1763798665.8274] device (tap348c8bec-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:04:25 np0005531888 systemd[1]: Started Virtual Machine qemu-48-instance-00000067.
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.832 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[1458cc4f-db10-4a27-8be7-307411b7199f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.851 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9a64d69f-1322-4f4f-8068-4131be012042]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.880 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1d10665d-43e3-47e1-98d8-ec69dae5431c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.886 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2d30ece9-23aa-41fc-a63a-4caf61280420]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 NetworkManager[55166]: <info>  [1763798665.8880] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Nov 22 03:04:25 np0005531888 systemd-udevd[229544]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.923 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c61cb388-bcc3-47a0-bf47-437c0f65c5af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.927 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[69874142-03c0-46d7-855d-e2f1eb05363d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 NetworkManager[55166]: <info>  [1763798665.9533] device (tap165f7f23-d0): carrier: link connected
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.961 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2a966a38-f3e9-4f2a-8fd6-02d77befb48e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.978 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[62b5817f-36d4-4518-9fe7-ee165419dfc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540659, 'reachable_time': 21224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229573, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:25.997 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[03a8400b-dadf-477e-888c-6b655bc3530e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540659, 'tstamp': 540659}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229574, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.017 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4da7a43e-573c-453a-bf5f-a047913147da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540659, 'reachable_time': 21224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229575, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.053 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d785125c-0a18-43bf-958f-79b48759f68d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.107 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0225ab1c-bbc6-4b41-8f95-9f8c11d77e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.109 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.109 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.110 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.111 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:26 np0005531888 NetworkManager[55166]: <info>  [1763798666.1121] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Nov 22 03:04:26 np0005531888 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.113 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.114 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:26Z|00315|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.126 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.127 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.128 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[54a69551-8c6b-41b5-b1cf-e9e8b4f7f581]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.129 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:04:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:26.130 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.570 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798666.570159, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.571 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Started (Lifecycle Event)#033[00m
Nov 22 03:04:26 np0005531888 podman[229609]: 2025-11-22 08:04:26.508352986 +0000 UTC m=+0.026418091 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.605 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.610 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798666.5712838, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.611 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.635 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.640 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:04:26 np0005531888 nova_compute[186788]: 2025-11-22 08:04:26.670 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:04:26 np0005531888 podman[229609]: 2025-11-22 08:04:26.955480051 +0000 UTC m=+0.473545126 container create 6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:04:27 np0005531888 systemd[1]: Started libpod-conmon-6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73.scope.
Nov 22 03:04:27 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:04:27 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cd0834e9f0dd5a843732e9bdd78baf1a54121e28808c3e0e1682ab6d6bd23d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:04:27 np0005531888 podman[229609]: 2025-11-22 08:04:27.239276648 +0000 UTC m=+0.757341753 container init 6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:04:27 np0005531888 podman[229609]: 2025-11-22 08:04:27.249462849 +0000 UTC m=+0.767527934 container start 6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:04:27 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [NOTICE]   (229634) : New worker (229636) forked
Nov 22 03:04:27 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [NOTICE]   (229634) : Loading success.
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.714 186792 DEBUG nova.compute.manager [req-6075fb62-ae52-4dac-9ed7-392d84d12e5b req-4c284c5f-3abf-4443-85b5-f6ec20d3d5ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.715 186792 DEBUG oslo_concurrency.lockutils [req-6075fb62-ae52-4dac-9ed7-392d84d12e5b req-4c284c5f-3abf-4443-85b5-f6ec20d3d5ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.715 186792 DEBUG oslo_concurrency.lockutils [req-6075fb62-ae52-4dac-9ed7-392d84d12e5b req-4c284c5f-3abf-4443-85b5-f6ec20d3d5ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.715 186792 DEBUG oslo_concurrency.lockutils [req-6075fb62-ae52-4dac-9ed7-392d84d12e5b req-4c284c5f-3abf-4443-85b5-f6ec20d3d5ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.715 186792 DEBUG nova.compute.manager [req-6075fb62-ae52-4dac-9ed7-392d84d12e5b req-4c284c5f-3abf-4443-85b5-f6ec20d3d5ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Processing event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.716 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.721 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798667.720834, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.721 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.723 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.726 186792 INFO nova.virt.libvirt.driver [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance spawned successfully.#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.726 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.742 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.748 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.751 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.752 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.752 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.753 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.753 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.753 186792 DEBUG nova.virt.libvirt.driver [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:04:27 np0005531888 nova_compute[186788]: 2025-11-22 08:04:27.801 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:04:28 np0005531888 nova_compute[186788]: 2025-11-22 08:04:28.058 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.411 186792 INFO nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Took 14.68 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.412 186792 DEBUG nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.509 186792 INFO nova.compute.manager [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Took 15.45 seconds to build instance.#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.552 186792 DEBUG oslo_concurrency.lockutils [None req-dc1f8b25-85a7-4ea0-ab54-2d5b9c4626c4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.865 186792 DEBUG nova.network.neutron [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated VIF entry in instance network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.865 186792 DEBUG nova.network.neutron [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:29 np0005531888 nova_compute[186788]: 2025-11-22 08:04:29.884 186792 DEBUG oslo_concurrency.lockutils [req-a80d6b6f-9650-484f-a05d-29c3456a6cee req-cc744c7b-fadc-4b35-9f9c-7d8bca88177f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:30 np0005531888 podman[229645]: 2025-11-22 08:04:30.675389337 +0000 UTC m=+0.046394982 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:04:30 np0005531888 podman[229646]: 2025-11-22 08:04:30.67714041 +0000 UTC m=+0.046996897 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:04:30 np0005531888 nova_compute[186788]: 2025-11-22 08:04:30.865 186792 DEBUG nova.compute.manager [req-2478e85f-2cf4-496f-bbc6-fcc31e221fb7 req-f17a9c29-4eeb-4a3a-a68d-250be07d426b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:30 np0005531888 nova_compute[186788]: 2025-11-22 08:04:30.865 186792 DEBUG oslo_concurrency.lockutils [req-2478e85f-2cf4-496f-bbc6-fcc31e221fb7 req-f17a9c29-4eeb-4a3a-a68d-250be07d426b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:30 np0005531888 nova_compute[186788]: 2025-11-22 08:04:30.865 186792 DEBUG oslo_concurrency.lockutils [req-2478e85f-2cf4-496f-bbc6-fcc31e221fb7 req-f17a9c29-4eeb-4a3a-a68d-250be07d426b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:30 np0005531888 nova_compute[186788]: 2025-11-22 08:04:30.865 186792 DEBUG oslo_concurrency.lockutils [req-2478e85f-2cf4-496f-bbc6-fcc31e221fb7 req-f17a9c29-4eeb-4a3a-a68d-250be07d426b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:30 np0005531888 nova_compute[186788]: 2025-11-22 08:04:30.865 186792 DEBUG nova.compute.manager [req-2478e85f-2cf4-496f-bbc6-fcc31e221fb7 req-f17a9c29-4eeb-4a3a-a68d-250be07d426b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:30 np0005531888 nova_compute[186788]: 2025-11-22 08:04:30.866 186792 WARNING nova.compute.manager [req-2478e85f-2cf4-496f-bbc6-fcc31e221fb7 req-f17a9c29-4eeb-4a3a-a68d-250be07d426b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:04:33 np0005531888 nova_compute[186788]: 2025-11-22 08:04:33.059 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:34 np0005531888 nova_compute[186788]: 2025-11-22 08:04:34.314 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:36.817 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:36.821 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:04:36.823 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.846 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'name': 'tempest-ServerActionsTestJSON-server-316749730', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000067', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'hostId': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.847 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.850 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b9ee5ebd-90a8-426a-b369-d38bf61616f2 / tap348c8bec-11 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.850 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2912dcaa-d4a8-48f8-9904-6a66a6525127', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.847642', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e3652b1c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '9a2a343de4c31d8ebede078aae192c3a5595cb05de6330fc02c9dbcffe729858'}]}, 'timestamp': '2025-11-22 08:04:36.851619', '_unique_id': '241f1273afaf4522af7ae8fe5d665f0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.852 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.853 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.880 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.read.latency volume: 1743935519 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.881 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.read.latency volume: 6764207 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8a461d-2cd7-4d60-ab0a-9fe027cd4766', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1743935519, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.853990', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e369b9a2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '9f901c4a9047ed58ec17d7a4cda006f33ae6b61b106abe7814566f808b87baea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6764207, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.853990', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e369cd70-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '371558d5648084a2dba80c2ba0cc859095abc211c86b301ef69c89a3dc6aee86'}]}, 'timestamp': '2025-11-22 08:04:36.881899', '_unique_id': 'f2b071bfceb84e8196f4693a5ec0de9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.883 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.898 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.899 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b9ee5ebd-90a8-426a-b369-d38bf61616f2: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.899 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.899 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>]
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.899 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.909 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.910 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83bb5013-4981-453f-9702-5ba8d574bec2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.899908', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e36e2aaa-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.599401671, 'message_signature': '7850ee5e0f94a705a5b3cb1b6490aadadf7b27901d9d663426af74c808896235'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.899908', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e36e3a04-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.599401671, 'message_signature': '28055d3ee2fbe886edb47d999bea69d359b4b17eb782b82eb000f164c64f637d'}]}, 'timestamp': '2025-11-22 08:04:36.911043', '_unique_id': 'd271829bffb34793a3e996f654df8d86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.913 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.913 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/cpu volume: 8870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08058b01-a07f-4d0b-89e5-3dd7f370d166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8870000000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'timestamp': '2025-11-22T08:04:36.913429', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e36eaa02-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.598288403, 'message_signature': 'ad4f489c020ffda761e04921f44a01347e1c59c4387db0a3fcb4f191aa03914b'}]}, 'timestamp': '2025-11-22 08:04:36.913756', '_unique_id': '3f5c35b8e5434f9e8d33ff74a2db2039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.915 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.915 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>]
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.916 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e1961e5-4540-42ac-a50b-05f76b6167c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.916148', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e36f13a2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': 'd3527292f94bb2ebec37053cc2ea782f145bcdb6125f9a4f4835dd7a04736a05'}]}, 'timestamp': '2025-11-22 08:04:36.916473', '_unique_id': '5fe7f028a8524cdfa1c221fa964d4f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.917 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.918 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bc984b1-1e99-4d31-88c6-59c1c8c87efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.917916', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e36f57f4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.599401671, 'message_signature': 'a377ad4be157613788f9c84e7b578468175a8c6b2aa9853b6a9576e828382149'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.917916', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e36f62b2-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.599401671, 'message_signature': '22113f310bb02fda6287eab2667b329c5357a4cd66daece262612462a9e6ff5c'}]}, 'timestamp': '2025-11-22 08:04:36.918478', '_unique_id': '1e9956d65b284a5bb1e9e96b6067ee78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.919 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.920 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d0dbdd6-72d8-40bc-835e-8146735e613f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.919966', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e36fa81c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': 'bd96360ffa9e85edf9e56c8d3b26845a4de685dd6490b3b5fd63a571f401dcd3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.919966', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e36fb30c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '489e949e4c9fca35bd9f617deececc14131ef4b5d23c00f6e87d746f7a0e8fd7'}]}, 'timestamp': '2025-11-22 08:04:36.920533', '_unique_id': 'd873945890d346a39c7b04c7e63c8bfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.922 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.922 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '872e2f14-be23-4373-b70d-372f21e0d03b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.921993', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e36ff736-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '0f954331e50fd24e5203e03c2c274734b50c044253e91062e325e6d2c1f979fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.921993', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e37001cc-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '470854825b421ac14eec789cedfa420ce8addd12b8259c9870054917af18defa'}]}, 'timestamp': '2025-11-22 08:04:36.922547', '_unique_id': '099351ce7cce4143a4eef610e5cea041'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.924 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.924 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b31a2c03-68da-4d46-b802-de606634c2c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.924046', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e3704790-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '9fb9e84345d462b400facaae769331da2fb243212cf44e8eeed47ea57e1aa88e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.924046', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e3705316-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '904649dc1163b44a04ca324ea988e55ad748d2a745a0d89bc75eec1668f674e2'}]}, 'timestamp': '2025-11-22 08:04:36.924657', '_unique_id': '7dcb94fa40584b86b84955618e0322ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.926 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0531ddd5-8217-4cd4-9967-3dae4365bbbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.926266', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e3709e84-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '371943448fcbb362f44274423509b07529def379e8278d0390e8a958e2add57b'}]}, 'timestamp': '2025-11-22 08:04:36.926596', '_unique_id': '0457d63c2fec4c938d6fe60e58e15e3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dbb3c47-77dd-46ee-8bbc-90096bd439c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.928037', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e370e36c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '007d70783955a8c607a41ca2b92ed561971ee400af917be0aa98ca9b767cd2a7'}]}, 'timestamp': '2025-11-22 08:04:36.928342', '_unique_id': 'b56df8c5fbe847158eec836801c07e3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.929 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cadb9f8a-7bdd-404e-929b-08773e7c093c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.929776', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e37127b4-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '2166abb6698718a1413b52b99e3675f7e0c7a1844375d612626367cca30489ab'}]}, 'timestamp': '2025-11-22 08:04:36.930090', '_unique_id': '57a6bce49c264b39af741dd792a98357'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.931 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>]
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.932 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-316749730>]
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.932 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '973aab1e-0aad-4d13-bd88-33e6ee8fb0ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.932331', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e3718b5a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '4e1b8a3a7d3ea41274fbd21264664d5269ffe956654b0d349764c9b8518c01fa'}]}, 'timestamp': '2025-11-22 08:04:36.932665', '_unique_id': 'a0ae83da328e4e15a9b5bfba02c5ae29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.934 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9f3877e-04d4-4047-9c84-fb3d000f3954', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.934116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e371d15a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '788caa5476d5f799c29cc612abffb21371403d6ebe4782e6b7799f58044ee6af'}]}, 'timestamp': '2025-11-22 08:04:36.934446', '_unique_id': '95bc1ac937b34d8c98a3fb447ad1eaa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9654f3fb-6e39-4149-81f9-baf3bb42b56a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.935995', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e3721a52-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '4867699d7d57355bea7f8b97cc06d611456af9bb3faf4ac533ff6309bf276a4c'}]}, 'timestamp': '2025-11-22 08:04:36.936298', '_unique_id': '8e2bcf6fb5714fa9a26b6936b2b7d83b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.937 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c21aa3a-5cb3-4388-ac02-c74271ddbabd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.937730', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e3725dfa-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '5c2ca08ef35f33f83b76b4c2aac8bd699d3e83b27b9e0f143acf906ec17bf99d'}]}, 'timestamp': '2025-11-22 08:04:36.938031', '_unique_id': '5f3a6a885e044661b5d9170d7a2610ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.939 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.939 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.939 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1be736a-7b0c-47e4-a433-f8d9e06eff23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.939432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e372a12a-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.599401671, 'message_signature': 'dfe1ed8175d72a5ec69a1891f2b4f16b6d886e11bf4b0424a420cbb9d7c9e51b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.939432', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e372aca6-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.599401671, 'message_signature': '3477e96579f177369ca945eb580dc8cab6c50a535471a90667bc66feab951f78'}]}, 'timestamp': '2025-11-22 08:04:36.940030', '_unique_id': '2667350e10824a5f93892a769d5ba69d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.941 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.941 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee5d341e-3bc0-41f2-90bd-9664e2e7b81e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.941669', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e372f7ba-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': 'a77edb40f589de02343004aa8349b5702644fab3ea904a8e1d729a14147a0d75'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': 
None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.941669', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e373023c-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '1604c8bf46f82206073a77d04a7aa915a7d5856f15bb41a57c0db42ee5feb2db'}]}, 'timestamp': '2025-11-22 08:04:36.942219', '_unique_id': '4fd229917a874825bb9f7a6a0b901312'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.943 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.943 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.943 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70192d1b-0ed0-4873-b76f-f8955e61133e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-vda', 'timestamp': '2025-11-22T08:04:36.943674', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e373463e-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '07ce5c7362994c1d805bf38ddbecfbc19ea90c7d4c76ba40e3c835d8a688ab6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2-sda', 'timestamp': '2025-11-22T08:04:36.943674', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'instance-00000067', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e37351b0-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.553499172, 'message_signature': '06d8242fe4cada22d026157d0e166755779b8ce1042fecdbda3b4d82f2f5ebc1'}]}, 'timestamp': '2025-11-22 08:04:36.944257', '_unique_id': '74940339aa9849ce86cff817549e6bab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.945 12 DEBUG ceilometer.compute.pollsters [-] b9ee5ebd-90a8-426a-b369-d38bf61616f2/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2527f25b-0c2b-4337-89d6-f2145da86c1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-00000067-b9ee5ebd-90a8-426a-b369-d38bf61616f2-tap348c8bec-11', 'timestamp': '2025-11-22T08:04:36.945817', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-316749730', 'name': 'tap348c8bec-11', 'instance_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'instance_type': 'm1.nano', 'host': 'a699a159568fb3c9f0058c3fa61469755389cbcb833f7ad447d1c82b', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:d5:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap348c8bec-11'}, 'message_id': 'e37399fe-c779-11f0-941d-fa163e6775e5', 'monotonic_time': 5417.547136216, 'message_signature': '0468b76948870197db6b4240943bdaecb778158922e3a9054dfeeabb1323726b'}]}, 'timestamp': '2025-11-22 08:04:36.946121', '_unique_id': '6dca736786fa4d5380c6350237895672'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:04:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:04:38 np0005531888 nova_compute[186788]: 2025-11-22 08:04:38.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:39 np0005531888 nova_compute[186788]: 2025-11-22 08:04:39.129 186792 DEBUG nova.compute.manager [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:39 np0005531888 nova_compute[186788]: 2025-11-22 08:04:39.130 186792 DEBUG nova.compute.manager [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing instance network info cache due to event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:04:39 np0005531888 nova_compute[186788]: 2025-11-22 08:04:39.130 186792 DEBUG oslo_concurrency.lockutils [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:39 np0005531888 nova_compute[186788]: 2025-11-22 08:04:39.130 186792 DEBUG oslo_concurrency.lockutils [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:39 np0005531888 nova_compute[186788]: 2025-11-22 08:04:39.131 186792 DEBUG nova.network.neutron [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:04:39 np0005531888 nova_compute[186788]: 2025-11-22 08:04:39.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:40 np0005531888 podman[229687]: 2025-11-22 08:04:40.694635443 +0000 UTC m=+0.068647078 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:04:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:40Z|00316|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:04:40 np0005531888 nova_compute[186788]: 2025-11-22 08:04:40.784 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:42 np0005531888 podman[229718]: 2025-11-22 08:04:42.68165418 +0000 UTC m=+0.053287311 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:04:42 np0005531888 nova_compute[186788]: 2025-11-22 08:04:42.830 186792 DEBUG nova.network.neutron [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated VIF entry in instance network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:04:42 np0005531888 nova_compute[186788]: 2025-11-22 08:04:42.831 186792 DEBUG nova.network.neutron [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:43 np0005531888 nova_compute[186788]: 2025-11-22 08:04:43.062 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:44 np0005531888 nova_compute[186788]: 2025-11-22 08:04:44.320 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:45 np0005531888 nova_compute[186788]: 2025-11-22 08:04:45.024 186792 DEBUG oslo_concurrency.lockutils [req-da4f950c-103e-4003-a163-2be910e2d8a5 req-64977bcd-f104-4e88-92fb-9a99a21c1c89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:47 np0005531888 podman[229744]: 2025-11-22 08:04:47.680930556 +0000 UTC m=+0.056959762 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:04:48 np0005531888 nova_compute[186788]: 2025-11-22 08:04:48.064 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:48 np0005531888 nova_compute[186788]: 2025-11-22 08:04:48.880 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:48 np0005531888 nova_compute[186788]: 2025-11-22 08:04:48.898 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:04:48 np0005531888 nova_compute[186788]: 2025-11-22 08:04:48.899 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:48 np0005531888 nova_compute[186788]: 2025-11-22 08:04:48.900 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:48 np0005531888 nova_compute[186788]: 2025-11-22 08:04:48.923 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:49 np0005531888 nova_compute[186788]: 2025-11-22 08:04:49.324 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:51Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 03:04:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:04:51Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 03:04:52 np0005531888 podman[229772]: 2025-11-22 08:04:52.696915252 +0000 UTC m=+0.062777715 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 03:04:52 np0005531888 podman[229773]: 2025-11-22 08:04:52.720310677 +0000 UTC m=+0.084593381 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:04:52 np0005531888 nova_compute[186788]: 2025-11-22 08:04:52.967 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:53 np0005531888 nova_compute[186788]: 2025-11-22 08:04:53.066 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:53 np0005531888 nova_compute[186788]: 2025-11-22 08:04:53.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:53 np0005531888 nova_compute[186788]: 2025-11-22 08:04:53.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:04:53 np0005531888 nova_compute[186788]: 2025-11-22 08:04:53.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:04:54 np0005531888 nova_compute[186788]: 2025-11-22 08:04:54.328 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:54 np0005531888 nova_compute[186788]: 2025-11-22 08:04:54.805 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:54 np0005531888 nova_compute[186788]: 2025-11-22 08:04:54.806 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:54 np0005531888 nova_compute[186788]: 2025-11-22 08:04:54.806 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:04:54 np0005531888 nova_compute[186788]: 2025-11-22 08:04:54.807 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:58 np0005531888 nova_compute[186788]: 2025-11-22 08:04:58.067 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.330 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.816 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.847 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.847 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.848 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.849 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.849 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:59 np0005531888 nova_compute[186788]: 2025-11-22 08:04:59.850 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:00 np0005531888 nova_compute[186788]: 2025-11-22 08:05:00.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:00 np0005531888 nova_compute[186788]: 2025-11-22 08:05:00.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:05:01 np0005531888 podman[229817]: 2025-11-22 08:05:01.680601744 +0000 UTC m=+0.048475983 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:05:01 np0005531888 podman[229816]: 2025-11-22 08:05:01.703521957 +0000 UTC m=+0.075697332 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:05:01 np0005531888 nova_compute[186788]: 2025-11-22 08:05:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:02 np0005531888 nova_compute[186788]: 2025-11-22 08:05:02.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:02 np0005531888 nova_compute[186788]: 2025-11-22 08:05:02.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:02 np0005531888 nova_compute[186788]: 2025-11-22 08:05:02.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:02 np0005531888 nova_compute[186788]: 2025-11-22 08:05:02.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:02 np0005531888 nova_compute[186788]: 2025-11-22 08:05:02.981 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:05:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:02Z|00317|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.047 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.072 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.078 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.153 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.154 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.213 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.363 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.364 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5520MB free_disk=73.24621963500977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.364 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.365 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.486 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance b9ee5ebd-90a8-426a-b369-d38bf61616f2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.487 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.487 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.509 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.528 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.528 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.558 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.612 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.658 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.682 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.705 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:05:03 np0005531888 nova_compute[186788]: 2025-11-22 08:05:03.706 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:04 np0005531888 nova_compute[186788]: 2025-11-22 08:05:04.334 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:04 np0005531888 nova_compute[186788]: 2025-11-22 08:05:04.420 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:05.344 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:05.344 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:05:05 np0005531888 nova_compute[186788]: 2025-11-22 08:05:05.345 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:08 np0005531888 nova_compute[186788]: 2025-11-22 08:05:08.072 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:09 np0005531888 nova_compute[186788]: 2025-11-22 08:05:09.336 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:10 np0005531888 nova_compute[186788]: 2025-11-22 08:05:10.636 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:11 np0005531888 podman[229865]: 2025-11-22 08:05:11.688232097 +0000 UTC m=+0.061961845 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:05:13 np0005531888 nova_compute[186788]: 2025-11-22 08:05:13.075 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:13.347 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:13 np0005531888 podman[229885]: 2025-11-22 08:05:13.675701758 +0000 UTC m=+0.051456076 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:05:14 np0005531888 nova_compute[186788]: 2025-11-22 08:05:14.339 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:17 np0005531888 nova_compute[186788]: 2025-11-22 08:05:17.489 186792 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:17 np0005531888 nova_compute[186788]: 2025-11-22 08:05:17.490 186792 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:17 np0005531888 nova_compute[186788]: 2025-11-22 08:05:17.490 186792 DEBUG nova.network.neutron [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:05:18 np0005531888 nova_compute[186788]: 2025-11-22 08:05:18.078 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:18 np0005531888 podman[229908]: 2025-11-22 08:05:18.674465745 +0000 UTC m=+0.047386726 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7)
Nov 22 03:05:19 np0005531888 nova_compute[186788]: 2025-11-22 08:05:19.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.088 186792 DEBUG nova.network.neutron [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.129 186792 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.534 186792 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.535 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Creating file /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/915c295cf1df4023a43f1f72a1e18054.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.536 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/915c295cf1df4023a43f1f72a1e18054.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.938 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/915c295cf1df4023a43f1f72a1e18054.tmp" returned: 1 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.939 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/915c295cf1df4023a43f1f72a1e18054.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.939 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Creating directory /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 03:05:22 np0005531888 nova_compute[186788]: 2025-11-22 08:05:22.939 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.081 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.138 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.142 186792 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.531 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.532 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.585 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:05:23 np0005531888 podman[229932]: 2025-11-22 08:05:23.680722036 +0000 UTC m=+0.055634289 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:05:23 np0005531888 podman[229933]: 2025-11-22 08:05:23.703347932 +0000 UTC m=+0.075232151 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.782 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.783 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.801 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:05:23 np0005531888 nova_compute[186788]: 2025-11-22 08:05:23.802 186792 INFO nova.compute.claims [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.209 186792 DEBUG nova.compute.provider_tree [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.228 186792 DEBUG nova.scheduler.client.report [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.300 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.301 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.345 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.429 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.430 186792 DEBUG nova.network.neutron [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.458 186792 INFO nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.485 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.730 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.731 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.732 186792 INFO nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Creating image(s)#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.733 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "/var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.733 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "/var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.734 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "/var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.749 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.812 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.813 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.814 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.827 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.886 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:24 np0005531888 nova_compute[186788]: 2025-11-22 08:05:24.887 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.006 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk 1073741824" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.008 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.008 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.062 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.063 186792 DEBUG nova.virt.disk.api [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Checking if we can resize image /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.064 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.108 186792 DEBUG nova.policy [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68050a7a73c74478bc5c540f68e3e639', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3cb45f10b9cc44f28a854f445948ff8d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.118 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.120 186792 DEBUG nova.virt.disk.api [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Cannot resize image /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.120 186792 DEBUG nova.objects.instance [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lazy-loading 'migration_context' on Instance uuid c25fedf6-8ee9-48d2-a91c-f5040b45cb61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.152 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.152 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Ensure instance console log exists: /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.153 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.153 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.153 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:25 np0005531888 kernel: tap348c8bec-11 (unregistering): left promiscuous mode
Nov 22 03:05:25 np0005531888 NetworkManager[55166]: <info>  [1763798725.3325] device (tap348c8bec-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00318|binding|INFO|Releasing lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc from this chassis (sb_readonly=0)
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00319|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc down in Southbound
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00320|binding|INFO|Removing iface tap348c8bec-11 ovn-installed in OVS
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.355 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.356 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.358 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.360 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eadd9fc7-153e-47b4-b8f8-33a02658052f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.360 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.363 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000067.scope: Deactivated successfully.
Nov 22 03:05:25 np0005531888 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000067.scope: Consumed 17.699s CPU time.
Nov 22 03:05:25 np0005531888 systemd-machined[153106]: Machine qemu-48-instance-00000067 terminated.
Nov 22 03:05:25 np0005531888 kernel: tap348c8bec-11: entered promiscuous mode
Nov 22 03:05:25 np0005531888 kernel: tap348c8bec-11 (unregistering): left promiscuous mode
Nov 22 03:05:25 np0005531888 NetworkManager[55166]: <info>  [1763798725.5822] manager: (tap348c8bec-11): new Tun device (/org/freedesktop/NetworkManager/Devices/161)
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.583 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00321|binding|INFO|Claiming lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for this chassis.
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00322|binding|INFO|348c8bec-11f0-4b6d-9dce-ae3c3f37efbc: Claiming fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.596 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00323|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc ovn-installed in OVS
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00324|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc up in Southbound
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00325|binding|INFO|Releasing lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc from this chassis (sb_readonly=1)
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.599 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00326|if_status|INFO|Not setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc down as sb is readonly
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00327|binding|INFO|Removing iface tap348c8bec-11 ovn-installed in OVS
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.601 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.611 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00328|binding|INFO|Releasing lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc from this chassis (sb_readonly=0)
Nov 22 03:05:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:25Z|00329|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc down in Southbound
Nov 22 03:05:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:25.660 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:25 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [NOTICE]   (229634) : haproxy version is 2.8.14-c23fe91
Nov 22 03:05:25 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [NOTICE]   (229634) : path to executable is /usr/sbin/haproxy
Nov 22 03:05:25 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [WARNING]  (229634) : Exiting Master process...
Nov 22 03:05:25 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [ALERT]    (229634) : Current worker (229636) exited with code 143 (Terminated)
Nov 22 03:05:25 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[229630]: [WARNING]  (229634) : All workers exited. Exiting... (0)
Nov 22 03:05:25 np0005531888 systemd[1]: libpod-6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73.scope: Deactivated successfully.
Nov 22 03:05:25 np0005531888 nova_compute[186788]: 2025-11-22 08:05:25.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:25 np0005531888 podman[230018]: 2025-11-22 08:05:25.754646433 +0000 UTC m=+0.309269675 container died 6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.159 186792 INFO nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.165 186792 INFO nova.virt.libvirt.driver [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance destroyed successfully.#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.166 186792 DEBUG nova.virt.libvirt.vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:04:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.167 186792 DEBUG nova.network.os_vif_util [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.167 186792 DEBUG nova.network.os_vif_util [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.168 186792 DEBUG os_vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.170 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.171 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap348c8bec-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.172 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.173 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.175 186792 INFO os_vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11')#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.180 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.241 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:26 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73-userdata-shm.mount: Deactivated successfully.
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.242 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:26 np0005531888 systemd[1]: var-lib-containers-storage-overlay-6cd0834e9f0dd5a843732e9bdd78baf1a54121e28808c3e0e1682ab6d6bd23d9-merged.mount: Deactivated successfully.
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.304 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.305 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Copying file /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk to 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.306 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:26 np0005531888 podman[230018]: 2025-11-22 08:05:26.316790456 +0000 UTC m=+0.871413698 container cleanup 6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:05:26 np0005531888 systemd[1]: libpod-conmon-6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73.scope: Deactivated successfully.
Nov 22 03:05:26 np0005531888 podman[230071]: 2025-11-22 08:05:26.402223196 +0000 UTC m=+0.062290502 container remove 6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.408 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d3a104-be79-4500-8b0d-68c4f98b2ade]: (4, ('Sat Nov 22 08:05:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73)\n6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73\nSat Nov 22 08:05:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73)\n6953107fd1ffdb4f5917332a6285b6ad1491556bf9581f3a9a80d11e06942d73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.410 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5860cc8b-7e2a-4ea1-bfc2-9567e208cede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.412 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.414 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:26 np0005531888 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 03:05:26 np0005531888 nova_compute[186788]: 2025-11-22 08:05:26.427 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.431 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[206488e2-bbcc-4fe3-b99a-96d73dc8b953]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.448 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[42321c90-a7c3-4f61-a7b5-476d608bb12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.450 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b628ba88-2767-4010-91c7-655695dbb400]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.464 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b0faa7a8-4a66-4e30-b279-dec36799f3b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540651, 'reachable_time': 43163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230087, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.467 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.467 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[631a2f07-1eb6-41bc-89e9-5136f95d451e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.469 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.470 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.471 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[af19462d-016d-4764-9c2e-f5c9ca9877b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.472 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.473 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:05:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:26.474 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2c22a800-b3cc-4990-982f-40b8290e88df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.045 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "scp -r /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk" returned: 0 in 0.740s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.047 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Copying file /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.047 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk.config 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.176 186792 DEBUG nova.compute.manager [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.177 186792 DEBUG oslo_concurrency.lockutils [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.177 186792 DEBUG oslo_concurrency.lockutils [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.177 186792 DEBUG oslo_concurrency.lockutils [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.177 186792 DEBUG nova.compute.manager [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.178 186792 WARNING nova.compute.manager [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.263 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "scp -C -r /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk.config 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config" returned: 0 in 0.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.264 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Copying file /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.264 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk.info 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:27 np0005531888 nova_compute[186788]: 2025-11-22 08:05:27.722 186792 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "scp -C -r /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_resize/disk.info 192.168.122.100:/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:28 np0005531888 nova_compute[186788]: 2025-11-22 08:05:28.085 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:28 np0005531888 nova_compute[186788]: 2025-11-22 08:05:28.336 186792 DEBUG neutronclient.v2_0.client [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:05:28 np0005531888 nova_compute[186788]: 2025-11-22 08:05:28.919 186792 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:28 np0005531888 nova_compute[186788]: 2025-11-22 08:05:28.919 186792 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:28 np0005531888 nova_compute[186788]: 2025-11-22 08:05:28.920 186792 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:29 np0005531888 nova_compute[186788]: 2025-11-22 08:05:29.714 186792 DEBUG nova.compute.manager [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:29 np0005531888 nova_compute[186788]: 2025-11-22 08:05:29.714 186792 DEBUG oslo_concurrency.lockutils [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:29 np0005531888 nova_compute[186788]: 2025-11-22 08:05:29.715 186792 DEBUG oslo_concurrency.lockutils [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:29 np0005531888 nova_compute[186788]: 2025-11-22 08:05:29.715 186792 DEBUG oslo_concurrency.lockutils [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:29 np0005531888 nova_compute[186788]: 2025-11-22 08:05:29.715 186792 DEBUG nova.compute.manager [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:29 np0005531888 nova_compute[186788]: 2025-11-22 08:05:29.716 186792 WARNING nova.compute.manager [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:05:30 np0005531888 nova_compute[186788]: 2025-11-22 08:05:30.437 186792 DEBUG nova.network.neutron [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Successfully created port: faf94f4d-c395-46ad-93f8-9d7ef5f27d12 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:05:31 np0005531888 nova_compute[186788]: 2025-11-22 08:05:31.176 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.357 186792 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.358 186792 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.358 186792 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.359 186792 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.359 186792 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.359 186792 WARNING nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.359 186792 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.360 186792 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.360 186792 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.360 186792 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.360 186792 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:32 np0005531888 nova_compute[186788]: 2025-11-22 08:05:32.361 186792 WARNING nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:05:32 np0005531888 podman[230093]: 2025-11-22 08:05:32.685971781 +0000 UTC m=+0.052029091 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:05:32 np0005531888 podman[230094]: 2025-11-22 08:05:32.704679681 +0000 UTC m=+0.061400661 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 03:05:33 np0005531888 nova_compute[186788]: 2025-11-22 08:05:33.091 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.636 186792 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.637 186792 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.637 186792 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.637 186792 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.637 186792 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.637 186792 WARNING nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.638 186792 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.638 186792 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.638 186792 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.638 186792 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.638 186792 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:34 np0005531888 nova_compute[186788]: 2025-11-22 08:05:34.638 186792 WARNING nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:05:36 np0005531888 nova_compute[186788]: 2025-11-22 08:05:36.179 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:36 np0005531888 nova_compute[186788]: 2025-11-22 08:05:36.805 186792 DEBUG nova.compute.manager [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:36 np0005531888 nova_compute[186788]: 2025-11-22 08:05:36.805 186792 DEBUG nova.compute.manager [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing instance network info cache due to event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:36 np0005531888 nova_compute[186788]: 2025-11-22 08:05:36.806 186792 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:36 np0005531888 nova_compute[186788]: 2025-11-22 08:05:36.806 186792 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:36 np0005531888 nova_compute[186788]: 2025-11-22 08:05:36.806 186792 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:36.819 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:36.820 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:36.820 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:37 np0005531888 nova_compute[186788]: 2025-11-22 08:05:37.002 186792 DEBUG nova.network.neutron [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Successfully updated port: faf94f4d-c395-46ad-93f8-9d7ef5f27d12 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:05:37 np0005531888 nova_compute[186788]: 2025-11-22 08:05:37.024 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:37 np0005531888 nova_compute[186788]: 2025-11-22 08:05:37.025 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquired lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:37 np0005531888 nova_compute[186788]: 2025-11-22 08:05:37.025 186792 DEBUG nova.network.neutron [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:05:37 np0005531888 nova_compute[186788]: 2025-11-22 08:05:37.146 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:37 np0005531888 nova_compute[186788]: 2025-11-22 08:05:37.874 186792 DEBUG nova.network.neutron [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:05:38 np0005531888 nova_compute[186788]: 2025-11-22 08:05:38.095 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:39 np0005531888 nova_compute[186788]: 2025-11-22 08:05:39.536 186792 DEBUG nova.compute.manager [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-changed-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:39 np0005531888 nova_compute[186788]: 2025-11-22 08:05:39.536 186792 DEBUG nova.compute.manager [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Refreshing instance network info cache due to event network-changed-faf94f4d-c395-46ad-93f8-9d7ef5f27d12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:39 np0005531888 nova_compute[186788]: 2025-11-22 08:05:39.537 186792 DEBUG oslo_concurrency.lockutils [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.628 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798725.6268322, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.629 186792 INFO nova.compute.manager [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.651 186792 DEBUG nova.compute.manager [None req-9f5cec60-ac2a-43c7-884f-7a0ea0119722 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.656 186792 DEBUG nova.compute.manager [None req-9f5cec60-ac2a-43c7-884f-7a0ea0119722 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_migrated, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.696 186792 INFO nova.compute.manager [None req-9f5cec60-ac2a-43c7-884f-7a0ea0119722 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.767 186792 DEBUG nova.network.neutron [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updating instance_info_cache with network_info: [{"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.841 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Releasing lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.841 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Instance network_info: |[{"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.841 186792 DEBUG oslo_concurrency.lockutils [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.842 186792 DEBUG nova.network.neutron [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Refreshing network info cache for port faf94f4d-c395-46ad-93f8-9d7ef5f27d12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.844 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Start _get_guest_xml network_info=[{"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.849 186792 WARNING nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.858 186792 DEBUG nova.virt.libvirt.host [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.858 186792 DEBUG nova.virt.libvirt.host [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.871 186792 DEBUG nova.virt.libvirt.host [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.872 186792 DEBUG nova.virt.libvirt.host [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.874 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.874 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.875 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.875 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.875 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.875 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.875 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.876 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.876 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.876 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.876 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.877 186792 DEBUG nova.virt.hardware [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.882 186792 DEBUG nova.virt.libvirt.vif [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:05:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=105,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK71tIxRMdncuWpyEZ3UI/snljNEMVFB7YULd++BTz1vmTgY+HZOnudxZ7/apao9vSiBdmD/tbqA0CZQO59zT5/wzEr9cAO0OmDAaw1kCXpC2NqTutjtHc5UhNEIUzEqeA==',key_name='tempest-keypair-1150638835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3cb45f10b9cc44f28a854f445948ff8d',ramdisk_id='',reservation_id='r-matfnu47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio
',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-867436661',owner_user_name='tempest-ServersTestFqdnHostnames-867436661-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:05:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='68050a7a73c74478bc5c540f68e3e639',uuid=c25fedf6-8ee9-48d2-a91c-f5040b45cb61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.883 186792 DEBUG nova.network.os_vif_util [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Converting VIF {"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.883 186792 DEBUG nova.network.os_vif_util [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.885 186792 DEBUG nova.objects.instance [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lazy-loading 'pci_devices' on Instance uuid c25fedf6-8ee9-48d2-a91c-f5040b45cb61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.981 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <uuid>c25fedf6-8ee9-48d2-a91c-f5040b45cb61</uuid>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <name>instance-00000069</name>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:name>guest-instance-1.domain.com</nova:name>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:05:40</nova:creationTime>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:user uuid="68050a7a73c74478bc5c540f68e3e639">tempest-ServersTestFqdnHostnames-867436661-project-member</nova:user>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:project uuid="3cb45f10b9cc44f28a854f445948ff8d">tempest-ServersTestFqdnHostnames-867436661</nova:project>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        <nova:port uuid="faf94f4d-c395-46ad-93f8-9d7ef5f27d12">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <entry name="serial">c25fedf6-8ee9-48d2-a91c-f5040b45cb61</entry>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <entry name="uuid">c25fedf6-8ee9-48d2-a91c-f5040b45cb61</entry>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.config"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:9f:98:6f"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <target dev="tapfaf94f4d-c3"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/console.log" append="off"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:05:40 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:05:40 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:05:40 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:05:40 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.983 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Preparing to wait for external event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.983 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.984 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.984 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.985 186792 DEBUG nova.virt.libvirt.vif [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:05:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=105,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK71tIxRMdncuWpyEZ3UI/snljNEMVFB7YULd++BTz1vmTgY+HZOnudxZ7/apao9vSiBdmD/tbqA0CZQO59zT5/wzEr9cAO0OmDAaw1kCXpC2NqTutjtHc5UhNEIUzEqeA==',key_name='tempest-keypair-1150638835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3cb45f10b9cc44f28a854f445948ff8d',ramdisk_id='',reservation_id='r-matfnu47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mod
el='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-867436661',owner_user_name='tempest-ServersTestFqdnHostnames-867436661-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:05:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='68050a7a73c74478bc5c540f68e3e639',uuid=c25fedf6-8ee9-48d2-a91c-f5040b45cb61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.985 186792 DEBUG nova.network.os_vif_util [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Converting VIF {"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.986 186792 DEBUG nova.network.os_vif_util [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.987 186792 DEBUG os_vif [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.988 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.989 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.992 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfaf94f4d-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.993 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfaf94f4d-c3, col_values=(('external_ids', {'iface-id': 'faf94f4d-c395-46ad-93f8-9d7ef5f27d12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:98:6f', 'vm-uuid': 'c25fedf6-8ee9-48d2-a91c-f5040b45cb61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:40 np0005531888 NetworkManager[55166]: <info>  [1763798740.9956] manager: (tapfaf94f4d-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Nov 22 03:05:40 np0005531888 nova_compute[186788]: 2025-11-22 08:05:40.997 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.000 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.001 186792 INFO os_vif [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3')#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.103 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.103 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.103 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] No VIF found with MAC fa:16:3e:9f:98:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.104 186792 INFO nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Using config drive#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.190 186792 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated VIF entry in instance network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.191 186792 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:41 np0005531888 nova_compute[186788]: 2025-11-22 08:05:41.249 186792 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.197 186792 INFO nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Creating config drive at /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.config#033[00m
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.206 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpob1kr_9o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.338 186792 DEBUG oslo_concurrency.processutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpob1kr_9o" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:42 np0005531888 kernel: tapfaf94f4d-c3: entered promiscuous mode
Nov 22 03:05:42 np0005531888 NetworkManager[55166]: <info>  [1763798742.4196] manager: (tapfaf94f4d-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Nov 22 03:05:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:42Z|00330|binding|INFO|Claiming lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 for this chassis.
Nov 22 03:05:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:42Z|00331|binding|INFO|faf94f4d-c395-46ad-93f8-9d7ef5f27d12: Claiming fa:16:3e:9f:98:6f 10.100.0.12
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.420 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:42Z|00332|binding|INFO|Setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 ovn-installed in OVS
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.436 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.438 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531888 systemd-udevd[230170]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:05:42 np0005531888 NetworkManager[55166]: <info>  [1763798742.4653] device (tapfaf94f4d-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:05:42 np0005531888 NetworkManager[55166]: <info>  [1763798742.4664] device (tapfaf94f4d-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:05:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:42Z|00333|binding|INFO|Setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 up in Southbound
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.471 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:98:6f 10.100.0.12'], port_security=['fa:16:3e:9f:98:6f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c25fedf6-8ee9-48d2-a91c-f5040b45cb61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-256b80dd-b157-4570-860a-47654e0873df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3cb45f10b9cc44f28a854f445948ff8d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cb8b4101-05bc-4911-9b59-cd8e80593f59', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7820d97-20c2-4b38-84a7-3f5b00fd5c33, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=faf94f4d-c395-46ad-93f8-9d7ef5f27d12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.472 104023 INFO neutron.agent.ovn.metadata.agent [-] Port faf94f4d-c395-46ad-93f8-9d7ef5f27d12 in datapath 256b80dd-b157-4570-860a-47654e0873df bound to our chassis#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.474 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 256b80dd-b157-4570-860a-47654e0873df#033[00m
Nov 22 03:05:42 np0005531888 systemd-machined[153106]: New machine qemu-49-instance-00000069.
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.484 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9e217a-bd40-455e-8cf7-7d9177b5a060]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.486 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap256b80dd-b1 in ovnmeta-256b80dd-b157-4570-860a-47654e0873df namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.488 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap256b80dd-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.488 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[93630f14-f6ea-4e5f-8e59-45eaa39e3faa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.489 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[03673ca7-a65b-49f2-be50-fc1d6a431f1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 systemd[1]: Started Virtual Machine qemu-49-instance-00000069.
Nov 22 03:05:42 np0005531888 podman[230149]: 2025-11-22 08:05:42.499987832 +0000 UTC m=+0.093708835 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS)
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.500 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[c7feb261-0b24-4dcc-996e-b2e0c3f39a7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.523 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd70958-197a-4740-b42d-4f8000d5c05e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.556 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[06cd58b0-e677-4ab8-9bd4-c0aed43f42ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.564 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5d6196-7736-40d8-977b-931fde86c60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 NetworkManager[55166]: <info>  [1763798742.5652] manager: (tap256b80dd-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/164)
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.598 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[feeae7c4-0d17-4c7a-a217-ce2188a518aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.601 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8c743a-dffa-4cbd-b8f2-7ccdfe90fb73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 NetworkManager[55166]: <info>  [1763798742.6241] device (tap256b80dd-b0): carrier: link connected
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.630 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[56c448f2-ef13-4f8f-9d87-439d714ab3e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.648 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f18d2c45-1e61-4f94-92c9-23c6d242e5c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap256b80dd-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:f4:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548326, 'reachable_time': 24039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230212, 'error': None, 'target': 'ovnmeta-256b80dd-b157-4570-860a-47654e0873df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.664 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a10b8ddd-a42b-40fa-9383-9c01d3817a1c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:f465'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548326, 'tstamp': 548326}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230213, 'error': None, 'target': 'ovnmeta-256b80dd-b157-4570-860a-47654e0873df', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.680 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1cb141-3cad-426a-8b25-27cd8c4d1cb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap256b80dd-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:f4:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548326, 'reachable_time': 24039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230214, 'error': None, 'target': 'ovnmeta-256b80dd-b157-4570-860a-47654e0873df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.712 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f999e7ba-575f-45c9-a77b-4432c64c2f29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.777 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9b473eb6-7ac0-4134-aca8-b3c2256f7247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.778 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap256b80dd-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.778 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.779 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap256b80dd-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:42 np0005531888 NetworkManager[55166]: <info>  [1763798742.7812] manager: (tap256b80dd-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.780 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531888 kernel: tap256b80dd-b0: entered promiscuous mode
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.784 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.786 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap256b80dd-b0, col_values=(('external_ids', {'iface-id': 'a3980c72-08b4-467a-93da-70e8d0a69589'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.787 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:42Z|00334|binding|INFO|Releasing lport a3980c72-08b4-467a-93da-70e8d0a69589 from this chassis (sb_readonly=0)
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.790 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/256b80dd-b157-4570-860a-47654e0873df.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/256b80dd-b157-4570-860a-47654e0873df.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.791 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d647fa31-f5f7-403f-9481-84783d642edd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.792 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-256b80dd-b157-4570-860a-47654e0873df
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/256b80dd-b157-4570-860a-47654e0873df.pid.haproxy
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 256b80dd-b157-4570-860a-47654e0873df
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:05:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:42.792 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-256b80dd-b157-4570-860a-47654e0873df', 'env', 'PROCESS_TAG=haproxy-256b80dd-b157-4570-860a-47654e0873df', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/256b80dd-b157-4570-860a-47654e0873df.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:05:42 np0005531888 nova_compute[186788]: 2025-11-22 08:05:42.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.097 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.166 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798743.1653128, c25fedf6-8ee9-48d2-a91c-f5040b45cb61 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.167 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] VM Started (Lifecycle Event)#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.208 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:43 np0005531888 podman[230252]: 2025-11-22 08:05:43.218731095 +0000 UTC m=+0.066677901 container create e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.218 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798743.1660976, c25fedf6-8ee9-48d2-a91c-f5040b45cb61 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.219 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.245 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.256 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:43 np0005531888 systemd[1]: Started libpod-conmon-e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065.scope.
Nov 22 03:05:43 np0005531888 podman[230252]: 2025-11-22 08:05:43.182528135 +0000 UTC m=+0.030474961 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:05:43 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:05:43 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e331bb25c8b3124f5bc65313327194a794e5d1519701bd5514a20e84c9bf3423/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.303 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:05:43 np0005531888 podman[230252]: 2025-11-22 08:05:43.309444816 +0000 UTC m=+0.157391642 container init e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:05:43 np0005531888 podman[230252]: 2025-11-22 08:05:43.317521054 +0000 UTC m=+0.165467860 container start e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:05:43 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [NOTICE]   (230271) : New worker (230273) forked
Nov 22 03:05:43 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [NOTICE]   (230271) : Loading success.
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.662 186792 DEBUG nova.compute.manager [req-1db5a156-19fd-4f3e-b233-2c699e0ddb97 req-533f5308-6a19-4028-bb6d-9cb35a646323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.663 186792 DEBUG oslo_concurrency.lockutils [req-1db5a156-19fd-4f3e-b233-2c699e0ddb97 req-533f5308-6a19-4028-bb6d-9cb35a646323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.663 186792 DEBUG oslo_concurrency.lockutils [req-1db5a156-19fd-4f3e-b233-2c699e0ddb97 req-533f5308-6a19-4028-bb6d-9cb35a646323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.664 186792 DEBUG oslo_concurrency.lockutils [req-1db5a156-19fd-4f3e-b233-2c699e0ddb97 req-533f5308-6a19-4028-bb6d-9cb35a646323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.664 186792 DEBUG nova.compute.manager [req-1db5a156-19fd-4f3e-b233-2c699e0ddb97 req-533f5308-6a19-4028-bb6d-9cb35a646323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Processing event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.665 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.669 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798743.6688662, c25fedf6-8ee9-48d2-a91c-f5040b45cb61 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.670 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.671 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.680 186792 INFO nova.virt.libvirt.driver [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Instance spawned successfully.#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.682 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.700 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.707 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.715 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.717 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.718 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.718 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.719 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.719 186792 DEBUG nova.virt.libvirt.driver [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.762 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.773 186792 DEBUG nova.compute.manager [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.774 186792 DEBUG oslo_concurrency.lockutils [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.774 186792 DEBUG oslo_concurrency.lockutils [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.774 186792 DEBUG oslo_concurrency.lockutils [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.774 186792 DEBUG nova.compute.manager [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.775 186792 WARNING nova.compute.manager [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_finish.#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.816 186792 INFO nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Took 19.09 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.817 186792 DEBUG nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.841 186792 DEBUG nova.network.neutron [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updated VIF entry in instance network info cache for port faf94f4d-c395-46ad-93f8-9d7ef5f27d12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.842 186792 DEBUG nova.network.neutron [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updating instance_info_cache with network_info: [{"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.881 186792 DEBUG oslo_concurrency.lockutils [req-bc53fabd-c92d-4242-8d81-fc152d58a3f0 req-b9bd444a-4a9a-4cfe-8624-c585f94683a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.954 186792 INFO nova.compute.manager [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Took 20.24 seconds to build instance.#033[00m
Nov 22 03:05:43 np0005531888 nova_compute[186788]: 2025-11-22 08:05:43.987 186792 DEBUG oslo_concurrency.lockutils [None req-59ca39e8-e53f-4702-8a5e-828a27380f29 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:44 np0005531888 podman[230282]: 2025-11-22 08:05:44.710946158 +0000 UTC m=+0.077920427 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:05:45 np0005531888 nova_compute[186788]: 2025-11-22 08:05:45.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.268 186792 DEBUG nova.compute.manager [req-abeca885-a02b-43ba-ad6f-4f88d13f52cf req-2e85e9a9-4938-49e7-a315-e9b161ffa912 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.269 186792 DEBUG oslo_concurrency.lockutils [req-abeca885-a02b-43ba-ad6f-4f88d13f52cf req-2e85e9a9-4938-49e7-a315-e9b161ffa912 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.269 186792 DEBUG oslo_concurrency.lockutils [req-abeca885-a02b-43ba-ad6f-4f88d13f52cf req-2e85e9a9-4938-49e7-a315-e9b161ffa912 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.270 186792 DEBUG oslo_concurrency.lockutils [req-abeca885-a02b-43ba-ad6f-4f88d13f52cf req-2e85e9a9-4938-49e7-a315-e9b161ffa912 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.270 186792 DEBUG nova.compute.manager [req-abeca885-a02b-43ba-ad6f-4f88d13f52cf req-2e85e9a9-4938-49e7-a315-e9b161ffa912 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] No waiting events found dispatching network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.270 186792 WARNING nova.compute.manager [req-abeca885-a02b-43ba-ad6f-4f88d13f52cf req-2e85e9a9-4938-49e7-a315-e9b161ffa912 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received unexpected event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.775 186792 DEBUG nova.compute.manager [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.775 186792 DEBUG oslo_concurrency.lockutils [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.775 186792 DEBUG oslo_concurrency.lockutils [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.775 186792 DEBUG oslo_concurrency.lockutils [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.776 186792 DEBUG nova.compute.manager [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:46 np0005531888 nova_compute[186788]: 2025-11-22 08:05:46.776 186792 WARNING nova.compute.manager [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.099 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.479 186792 DEBUG nova.compute.manager [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-changed-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.480 186792 DEBUG nova.compute.manager [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Refreshing instance network info cache due to event network-changed-faf94f4d-c395-46ad-93f8-9d7ef5f27d12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.481 186792 DEBUG oslo_concurrency.lockutils [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.481 186792 DEBUG oslo_concurrency.lockutils [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.482 186792 DEBUG nova.network.neutron [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Refreshing network info cache for port faf94f4d-c395-46ad-93f8-9d7ef5f27d12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:48 np0005531888 nova_compute[186788]: 2025-11-22 08:05:48.675 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:49 np0005531888 podman[230306]: 2025-11-22 08:05:49.708937825 +0000 UTC m=+0.076935973 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, vendor=Red Hat, Inc.)
Nov 22 03:05:50 np0005531888 nova_compute[186788]: 2025-11-22 08:05:50.870 186792 DEBUG nova.network.neutron [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updated VIF entry in instance network info cache for port faf94f4d-c395-46ad-93f8-9d7ef5f27d12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:50 np0005531888 nova_compute[186788]: 2025-11-22 08:05:50.873 186792 DEBUG nova.network.neutron [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updating instance_info_cache with network_info: [{"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:50 np0005531888 nova_compute[186788]: 2025-11-22 08:05:50.937 186792 DEBUG oslo_concurrency.lockutils [req-9012d04e-d027-4886-8a06-1bd27857840c req-716bc999-e948-426c-85c0-ecd616c40c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:50 np0005531888 nova_compute[186788]: 2025-11-22 08:05:50.999 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.016 186792 DEBUG nova.compute.manager [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.017 186792 DEBUG oslo_concurrency.lockutils [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.017 186792 DEBUG oslo_concurrency.lockutils [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.018 186792 DEBUG oslo_concurrency.lockutils [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.018 186792 DEBUG nova.compute.manager [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.018 186792 WARNING nova.compute.manager [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.340 186792 INFO nova.compute.manager [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Swapping old allocation on dict_keys(['1afd6948-7df7-46e7-8718-35e2b3007a5d']) held by migration 0696147e-c4a0-43ec-ad4c-96615e87f0b0 for instance#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.419 186792 DEBUG nova.scheduler.client.report [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Overwriting current allocation {'allocations': {'0a011418-630a-4be8-ab23-41ec1c11a5ea': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}, 'generation': 66}}, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'consumer_generation': 1} on consumer b9ee5ebd-90a8-426a-b369-d38bf61616f2 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Nov 22 03:05:51 np0005531888 nova_compute[186788]: 2025-11-22 08:05:51.856 186792 INFO nova.network.neutron [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.102 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.383 186792 DEBUG nova.compute.manager [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.384 186792 DEBUG oslo_concurrency.lockutils [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.385 186792 DEBUG oslo_concurrency.lockutils [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.385 186792 DEBUG oslo_concurrency.lockutils [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.386 186792 DEBUG nova.compute.manager [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.387 186792 WARNING nova.compute.manager [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.958 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.959 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:53 np0005531888 nova_compute[186788]: 2025-11-22 08:05:53.959 186792 DEBUG nova.network.neutron [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.173 186792 DEBUG nova.compute.manager [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.174 186792 DEBUG nova.compute.manager [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing instance network info cache due to event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.174 186792 DEBUG oslo_concurrency.lockutils [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.705 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:54 np0005531888 podman[230326]: 2025-11-22 08:05:54.721611314 +0000 UTC m=+0.089611814 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 22 03:05:54 np0005531888 podman[230325]: 2025-11-22 08:05:54.734310086 +0000 UTC m=+0.105367371 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:05:54 np0005531888 nova_compute[186788]: 2025-11-22 08:05:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:05:55 np0005531888 nova_compute[186788]: 2025-11-22 08:05:55.328 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:55 np0005531888 nova_compute[186788]: 2025-11-22 08:05:55.329 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:55 np0005531888 nova_compute[186788]: 2025-11-22 08:05:55.329 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:05:55 np0005531888 nova_compute[186788]: 2025-11-22 08:05:55.330 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c25fedf6-8ee9-48d2-a91c-f5040b45cb61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:55 np0005531888 nova_compute[186788]: 2025-11-22 08:05:55.619 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.002 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.776 186792 DEBUG nova.network.neutron [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.829 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.830 186792 DEBUG nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.832 186792 DEBUG oslo_concurrency.lockutils [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.833 186792 DEBUG nova.network.neutron [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.840 186792 DEBUG nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start _get_guest_xml network_info=[{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.844 186792 WARNING nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.850 186792 DEBUG nova.virt.libvirt.host [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.850 186792 DEBUG nova.virt.libvirt.host [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.853 186792 DEBUG nova.virt.libvirt.host [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.854 186792 DEBUG nova.virt.libvirt.host [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.855 186792 DEBUG nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.856 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.856 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.856 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.856 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.857 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.857 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.857 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.857 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.858 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.858 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.858 186792 DEBUG nova.virt.hardware [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.858 186792 DEBUG nova.objects.instance [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.893 186792 DEBUG oslo_concurrency.processutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.954 186792 DEBUG oslo_concurrency.processutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.955 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.956 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.957 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.958 186792 DEBUG nova.virt.libvirt.vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:05:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.958 186792 DEBUG nova.network.os_vif_util [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.960 186792 DEBUG nova.network.os_vif_util [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.962 186792 DEBUG nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <uuid>b9ee5ebd-90a8-426a-b369-d38bf61616f2</uuid>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <name>instance-00000067</name>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerActionsTestJSON-server-316749730</nova:name>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:05:56</nova:creationTime>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        <nova:port uuid="348c8bec-11f0-4b6d-9dce-ae3c3f37efbc">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <entry name="serial">b9ee5ebd-90a8-426a-b369-d38bf61616f2</entry>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <entry name="uuid">b9ee5ebd-90a8-426a-b369-d38bf61616f2</entry>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:ba:d5:b9"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <target dev="tap348c8bec-11"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/console.log" append="off"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:05:56 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:05:56 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:05:56 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:05:56 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.963 186792 DEBUG nova.compute.manager [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Preparing to wait for external event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.964 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.964 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.964 186792 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.965 186792 DEBUG nova.virt.libvirt.vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:05:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.965 186792 DEBUG nova.network.os_vif_util [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.966 186792 DEBUG nova.network.os_vif_util [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.966 186792 DEBUG os_vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.967 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.968 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.970 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.970 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap348c8bec-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.970 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap348c8bec-11, col_values=(('external_ids', {'iface-id': '348c8bec-11f0-4b6d-9dce-ae3c3f37efbc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:d5:b9', 'vm-uuid': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.972 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531888 NetworkManager[55166]: <info>  [1763798756.9735] manager: (tap348c8bec-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531888 nova_compute[186788]: 2025-11-22 08:05:56.982 186792 INFO os_vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11')#033[00m
Nov 22 03:05:57 np0005531888 kernel: tap348c8bec-11: entered promiscuous mode
Nov 22 03:05:57 np0005531888 NetworkManager[55166]: <info>  [1763798757.1119] manager: (tap348c8bec-11): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Nov 22 03:05:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:57Z|00335|binding|INFO|Claiming lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for this chassis.
Nov 22 03:05:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:57Z|00336|binding|INFO|348c8bec-11f0-4b6d-9dce-ae3c3f37efbc: Claiming fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.116 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.127 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.129 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis#033[00m
Nov 22 03:05:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:57Z|00337|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc ovn-installed in OVS
Nov 22 03:05:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:57Z|00338|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc up in Southbound
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.131 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.136 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.144 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[31269bde-03d5-4781-980d-228531804b24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.145 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.146 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.147 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b291a7-7639-4d0f-b697-1a7e04ecdff4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.147 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dde546c8-b2d2-4fe9-8529-5cbe1bfebbe1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 systemd-udevd[230396]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:05:57 np0005531888 systemd-machined[153106]: New machine qemu-50-instance-00000067.
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.159 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0426d8-f6e9-43e8-ab19-8a0ee8ff78d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 NetworkManager[55166]: <info>  [1763798757.1659] device (tap348c8bec-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:05:57 np0005531888 NetworkManager[55166]: <info>  [1763798757.1669] device (tap348c8bec-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.175 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[562283f5-da5b-4d5b-af74-6bc844cf67c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 systemd[1]: Started Virtual Machine qemu-50-instance-00000067.
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.206 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[043c023d-662c-48bb-8a0d-75de13734804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 systemd-udevd[230401]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:05:57 np0005531888 NetworkManager[55166]: <info>  [1763798757.2158] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.214 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[79f7ac00-220e-45bc-94f4-be8110060943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.249 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[667d6e7a-bc33-49ed-8143-850ca7ec35a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.253 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4b20b4af-d02b-467d-921a-95d001e8befd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 NetworkManager[55166]: <info>  [1763798757.2783] device (tap165f7f23-d0): carrier: link connected
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.284 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6bda9683-b259-4a95-adab-4ff1878bd97a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.302 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2a041caf-e90a-450c-b55c-4bcf8d10ffd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549792, 'reachable_time': 38373, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230429, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.318 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5b461401-7e60-48df-91a5-6a54b1ff6365]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549792, 'tstamp': 549792}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230430, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.333 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d8e4ca-fd98-47f2-9db9-071bd276803d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549792, 'reachable_time': 38373, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230431, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.375 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc382d1-2d0a-4246-b695-fd76e9f2a434]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.436 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2fcc9ee5-1e0f-470a-b4f9-349c824e8674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.438 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.438 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.439 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.441 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 NetworkManager[55166]: <info>  [1763798757.4426] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Nov 22 03:05:57 np0005531888 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.445 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.447 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:57Z|00339|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.451 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.452 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.453 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[daa09bf8-4112-464b-8a7d-2fce482dc3f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.454 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:05:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:05:57.455 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.464 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.639 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798757.638813, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.639 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Started (Lifecycle Event)#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.668 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.672 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798757.6416094, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.673 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.702 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.706 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.737 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.896 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updating instance_info_cache with network_info: [{"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.913 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-c25fedf6-8ee9-48d2-a91c-f5040b45cb61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.913 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.913 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:57 np0005531888 nova_compute[186788]: 2025-11-22 08:05:57.914 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:57 np0005531888 podman[230469]: 2025-11-22 08:05:57.842490564 +0000 UTC m=+0.024049652 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:05:58 np0005531888 nova_compute[186788]: 2025-11-22 08:05:58.103 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:58 np0005531888 podman[230469]: 2025-11-22 08:05:58.467044552 +0000 UTC m=+0.648603600 container create a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:05:58 np0005531888 systemd[1]: Started libpod-conmon-a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc.scope.
Nov 22 03:05:58 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:05:58 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ad3e32291f488ca52d48998ce9ff49a84d03566d0ab21cc22899b2816eb7f9c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:05:58 np0005531888 podman[230469]: 2025-11-22 08:05:58.72604285 +0000 UTC m=+0.907601938 container init a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:05:58 np0005531888 podman[230469]: 2025-11-22 08:05:58.733501594 +0000 UTC m=+0.915060652 container start a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:05:58 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [NOTICE]   (230500) : New worker (230502) forked
Nov 22 03:05:58 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [NOTICE]   (230500) : Loading success.
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.228 186792 DEBUG nova.compute.manager [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.228 186792 DEBUG oslo_concurrency.lockutils [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.228 186792 DEBUG oslo_concurrency.lockutils [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.229 186792 DEBUG oslo_concurrency.lockutils [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.229 186792 DEBUG nova.compute.manager [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Processing event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.230 186792 DEBUG nova.compute.manager [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.234 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798759.234073, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.234 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.238 186792 INFO nova.virt.libvirt.driver [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance running successfully.#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.238 186792 DEBUG nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.383 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:59Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:98:6f 10.100.0.12
Nov 22 03:05:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:05:59Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:98:6f 10.100.0.12
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.388 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.486 186792 INFO nova.compute.manager [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance to original state: 'active'#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.493 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.617 186792 DEBUG nova.network.neutron [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated VIF entry in instance network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.618 186792 DEBUG nova.network.neutron [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.649 186792 DEBUG oslo_concurrency.lockutils [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:59 np0005531888 nova_compute[186788]: 2025-11-22 08:05:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.406 186792 DEBUG nova.compute.manager [req-4ef580b0-9c6c-465b-8cd3-ee8c6a424ee4 req-380fc759-a0d5-4175-a045-3deb48fd4ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.406 186792 DEBUG oslo_concurrency.lockutils [req-4ef580b0-9c6c-465b-8cd3-ee8c6a424ee4 req-380fc759-a0d5-4175-a045-3deb48fd4ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.406 186792 DEBUG oslo_concurrency.lockutils [req-4ef580b0-9c6c-465b-8cd3-ee8c6a424ee4 req-380fc759-a0d5-4175-a045-3deb48fd4ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.407 186792 DEBUG oslo_concurrency.lockutils [req-4ef580b0-9c6c-465b-8cd3-ee8c6a424ee4 req-380fc759-a0d5-4175-a045-3deb48fd4ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.407 186792 DEBUG nova.compute.manager [req-4ef580b0-9c6c-465b-8cd3-ee8c6a424ee4 req-380fc759-a0d5-4175-a045-3deb48fd4ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.407 186792 WARNING nova.compute.manager [req-4ef580b0-9c6c-465b-8cd3-ee8c6a424ee4 req-380fc759-a0d5-4175-a045-3deb48fd4ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:06:01 np0005531888 nova_compute[186788]: 2025-11-22 08:06:01.974 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:02Z|00340|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 03:06:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:02Z|00341|binding|INFO|Releasing lport a3980c72-08b4-467a-93da-70e8d0a69589 from this chassis (sb_readonly=0)
Nov 22 03:06:02 np0005531888 nova_compute[186788]: 2025-11-22 08:06:02.186 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:03 np0005531888 podman[230512]: 2025-11-22 08:06:03.683470861 +0000 UTC m=+0.052694267 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:06:03 np0005531888 podman[230513]: 2025-11-22 08:06:03.683501371 +0000 UTC m=+0.051212199 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.795 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.795 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.796 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.796 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.796 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.803 186792 INFO nova.compute.manager [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Terminating instance#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.808 186792 DEBUG nova.compute.manager [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:06:03 np0005531888 kernel: tap348c8bec-11 (unregistering): left promiscuous mode
Nov 22 03:06:03 np0005531888 NetworkManager[55166]: <info>  [1763798763.8407] device (tap348c8bec-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:06:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:03Z|00342|binding|INFO|Releasing lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc from this chassis (sb_readonly=0)
Nov 22 03:06:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:03Z|00343|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc down in Southbound
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.844 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:03Z|00344|binding|INFO|Removing iface tap348c8bec-11 ovn-installed in OVS
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.848 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:03.855 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:03.857 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:06:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:03.858 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:03.860 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[132615c1-278d-478a-83c9-5e3ba67d0d9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:03.860 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:03 np0005531888 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000067.scope: Deactivated successfully.
Nov 22 03:06:03 np0005531888 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000067.scope: Consumed 5.196s CPU time.
Nov 22 03:06:03 np0005531888 systemd-machined[153106]: Machine qemu-50-instance-00000067 terminated.
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:03 np0005531888 nova_compute[186788]: 2025-11-22 08:06:03.984 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:06:04 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [NOTICE]   (230500) : haproxy version is 2.8.14-c23fe91
Nov 22 03:06:04 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [NOTICE]   (230500) : path to executable is /usr/sbin/haproxy
Nov 22 03:06:04 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [WARNING]  (230500) : Exiting Master process...
Nov 22 03:06:04 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [ALERT]    (230500) : Current worker (230502) exited with code 143 (Terminated)
Nov 22 03:06:04 np0005531888 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230496]: [WARNING]  (230500) : All workers exited. Exiting... (0)
Nov 22 03:06:04 np0005531888 systemd[1]: libpod-a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc.scope: Deactivated successfully.
Nov 22 03:06:04 np0005531888 podman[230577]: 2025-11-22 08:06:04.011808815 +0000 UTC m=+0.062642342 container died a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:06:04 np0005531888 kernel: tap348c8bec-11: entered promiscuous mode
Nov 22 03:06:04 np0005531888 NetworkManager[55166]: <info>  [1763798764.0281] manager: (tap348c8bec-11): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Nov 22 03:06:04 np0005531888 kernel: tap348c8bec-11 (unregistering): left promiscuous mode
Nov 22 03:06:04 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:04Z|00345|binding|INFO|Claiming lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for this chassis.
Nov 22 03:06:04 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:04Z|00346|binding|INFO|348c8bec-11f0-4b6d-9dce-ae3c3f37efbc: Claiming fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.035 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.050 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:04 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc-userdata-shm.mount: Deactivated successfully.
Nov 22 03:06:04 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:04Z|00347|binding|INFO|Releasing lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc from this chassis (sb_readonly=0)
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.056 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 systemd[1]: var-lib-containers-storage-overlay-8ad3e32291f488ca52d48998ce9ff49a84d03566d0ab21cc22899b2816eb7f9c-merged.mount: Deactivated successfully.
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.065 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.084 186792 INFO nova.virt.libvirt.driver [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance destroyed successfully.#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.084 186792 DEBUG nova.objects.instance [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:04 np0005531888 podman[230577]: 2025-11-22 08:06:04.095579304 +0000 UTC m=+0.146412821 container cleanup a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.101 186792 DEBUG nova.virt.libvirt.vif [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:05:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.102 186792 DEBUG nova.network.os_vif_util [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.103 186792 DEBUG nova.network.os_vif_util [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.103 186792 DEBUG os_vif [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:06:04 np0005531888 systemd[1]: libpod-conmon-a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc.scope: Deactivated successfully.
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.106 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap348c8bec-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.112 186792 INFO os_vif [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11')#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.113 186792 INFO nova.virt.libvirt.driver [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Deleting instance files /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_del#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.119 186792 INFO nova.virt.libvirt.driver [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Deletion of /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_del complete#033[00m
Nov 22 03:06:04 np0005531888 podman[230616]: 2025-11-22 08:06:04.175225882 +0000 UTC m=+0.056835048 container remove a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.180 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3805981e-553f-4d90-955d-e28706c44e41]: (4, ('Sat Nov 22 08:06:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc)\na5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc\nSat Nov 22 08:06:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (a5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc)\na5196bc894f8be93ea4c26381d1ddb4a5e6a32c3197b2772b56926a0dc967cdc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.182 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cc954cae-12a2-4c8b-83c7-77dfde24cdd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.184 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.185 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.189 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.192 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb82a6d-055b-4769-b5c1-ecc31dc898af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.213 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3038a011-851a-40e7-9c41-ecd78cef2c54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.214 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[559f7ebd-d9bd-490b-a06d-04a8f8ceff06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.216 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000067, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.221 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.230 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8147834e-286f-48e0-beac-f64b198111f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549784, 'reachable_time': 17484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230631, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.233 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.233 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f19577b4-bef9-4eca-93f2-75174fc8248e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.234 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:06:04 np0005531888 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.235 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.236 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ff5c37-b104-4285-8752-0b37bf7ca335]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.237 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.238 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:04.238 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[26417880-7420-4f39-871d-a2af2b2102e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.252 186792 INFO nova.compute.manager [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.253 186792 DEBUG oslo.service.loopingcall [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.253 186792 DEBUG nova.compute.manager [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.253 186792 DEBUG nova.network.neutron [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.297 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.298 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.360 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.537 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.538 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5514MB free_disk=73.21733093261719GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.538 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.538 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.687 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance c25fedf6-8ee9-48d2-a91c-f5040b45cb61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.688 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance b9ee5ebd-90a8-426a-b369-d38bf61616f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.688 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.688 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:06:04 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:04Z|00348|binding|INFO|Releasing lport a3980c72-08b4-467a-93da-70e8d0a69589 from this chassis (sb_readonly=0)
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.794 186792 DEBUG nova.compute.manager [req-d80c43d5-fdec-4b3f-9bc7-937c5e180828 req-fd1cfbda-9d9d-427f-ab9f-84eaf858e459 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.795 186792 DEBUG oslo_concurrency.lockutils [req-d80c43d5-fdec-4b3f-9bc7-937c5e180828 req-fd1cfbda-9d9d-427f-ab9f-84eaf858e459 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.795 186792 DEBUG oslo_concurrency.lockutils [req-d80c43d5-fdec-4b3f-9bc7-937c5e180828 req-fd1cfbda-9d9d-427f-ab9f-84eaf858e459 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.795 186792 DEBUG oslo_concurrency.lockutils [req-d80c43d5-fdec-4b3f-9bc7-937c5e180828 req-fd1cfbda-9d9d-427f-ab9f-84eaf858e459 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.795 186792 DEBUG nova.compute.manager [req-d80c43d5-fdec-4b3f-9bc7-937c5e180828 req-fd1cfbda-9d9d-427f-ab9f-84eaf858e459 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.796 186792 DEBUG nova.compute.manager [req-d80c43d5-fdec-4b3f-9bc7-937c5e180828 req-fd1cfbda-9d9d-427f-ab9f-84eaf858e459 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.800 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.823 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.846 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.869 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:06:04 np0005531888 nova_compute[186788]: 2025-11-22 08:06:04.870 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.593 186792 DEBUG nova.network.neutron [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.628 186792 INFO nova.compute.manager [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Took 2.37 seconds to deallocate network for instance.#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.714 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.715 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.794 186792 DEBUG nova.compute.provider_tree [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.812 186792 DEBUG nova.compute.manager [req-e8bb9f4b-d882-43fa-818e-4b4b17ab23ca req-983c1604-82fa-44c6-9f17-d78ae3d1b98c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-deleted-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.832 186792 DEBUG nova.scheduler.client.report [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.913 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:06.981 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:06 np0005531888 nova_compute[186788]: 2025-11-22 08:06:06.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:06.982 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.001 186792 DEBUG nova.compute.manager [req-f02d3a41-bf26-4c7d-8092-884b5acdd7f5 req-dea96c1b-b4d2-4d7b-9b40-562b438e3e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.002 186792 DEBUG oslo_concurrency.lockutils [req-f02d3a41-bf26-4c7d-8092-884b5acdd7f5 req-dea96c1b-b4d2-4d7b-9b40-562b438e3e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.002 186792 DEBUG oslo_concurrency.lockutils [req-f02d3a41-bf26-4c7d-8092-884b5acdd7f5 req-dea96c1b-b4d2-4d7b-9b40-562b438e3e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.002 186792 DEBUG oslo_concurrency.lockutils [req-f02d3a41-bf26-4c7d-8092-884b5acdd7f5 req-dea96c1b-b4d2-4d7b-9b40-562b438e3e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.002 186792 DEBUG nova.compute.manager [req-f02d3a41-bf26-4c7d-8092-884b5acdd7f5 req-dea96c1b-b4d2-4d7b-9b40-562b438e3e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.003 186792 WARNING nova.compute.manager [req-f02d3a41-bf26-4c7d-8092-884b5acdd7f5 req-dea96c1b-b4d2-4d7b-9b40-562b438e3e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.058 186792 INFO nova.scheduler.client.report [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Deleted allocations for instance b9ee5ebd-90a8-426a-b369-d38bf61616f2#033[00m
Nov 22 03:06:07 np0005531888 nova_compute[186788]: 2025-11-22 08:06:07.165 186792 DEBUG oslo_concurrency.lockutils [None req-586c370a-56c6-4cd8-b172-0e2921ce12db b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:07.985 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:08 np0005531888 nova_compute[186788]: 2025-11-22 08:06:08.106 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:09 np0005531888 nova_compute[186788]: 2025-11-22 08:06:09.109 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:09 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:09Z|00349|binding|INFO|Releasing lport a3980c72-08b4-467a-93da-70e8d0a69589 from this chassis (sb_readonly=0)
Nov 22 03:06:09 np0005531888 nova_compute[186788]: 2025-11-22 08:06:09.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:10 np0005531888 nova_compute[186788]: 2025-11-22 08:06:10.866 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:12 np0005531888 podman[230638]: 2025-11-22 08:06:12.693532205 +0000 UTC m=+0.061077773 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:06:13 np0005531888 nova_compute[186788]: 2025-11-22 08:06:13.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:14 np0005531888 nova_compute[186788]: 2025-11-22 08:06:14.111 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:15 np0005531888 podman[230659]: 2025-11-22 08:06:15.678581275 +0000 UTC m=+0.050394270 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:06:15 np0005531888 nova_compute[186788]: 2025-11-22 08:06:15.719 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:18 np0005531888 nova_compute[186788]: 2025-11-22 08:06:18.109 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:19 np0005531888 nova_compute[186788]: 2025-11-22 08:06:19.080 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798764.0785303, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:19 np0005531888 nova_compute[186788]: 2025-11-22 08:06:19.081 186792 INFO nova.compute.manager [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:06:19 np0005531888 nova_compute[186788]: 2025-11-22 08:06:19.096 186792 DEBUG nova.compute.manager [None req-1a4585bc-0c32-42dc-a4b9-3c3cb5a1378a - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:19 np0005531888 nova_compute[186788]: 2025-11-22 08:06:19.113 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:19 np0005531888 nova_compute[186788]: 2025-11-22 08:06:19.370 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.099 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.100 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.102 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.102 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.102 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.109 186792 INFO nova.compute.manager [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Terminating instance#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.116 186792 DEBUG nova.compute.manager [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:06:20 np0005531888 kernel: tapfaf94f4d-c3 (unregistering): left promiscuous mode
Nov 22 03:06:20 np0005531888 NetworkManager[55166]: <info>  [1763798780.1545] device (tapfaf94f4d-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00350|binding|INFO|Releasing lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 from this chassis (sb_readonly=0)
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00351|binding|INFO|Setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 down in Southbound
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.156 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00352|binding|INFO|Removing iface tapfaf94f4d-c3 ovn-installed in OVS
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.158 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.173 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000069.scope: Deactivated successfully.
Nov 22 03:06:20 np0005531888 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000069.scope: Consumed 15.953s CPU time.
Nov 22 03:06:20 np0005531888 systemd-machined[153106]: Machine qemu-49-instance-00000069 terminated.
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.243 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:98:6f 10.100.0.12'], port_security=['fa:16:3e:9f:98:6f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c25fedf6-8ee9-48d2-a91c-f5040b45cb61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-256b80dd-b157-4570-860a-47654e0873df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3cb45f10b9cc44f28a854f445948ff8d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb8b4101-05bc-4911-9b59-cd8e80593f59', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7820d97-20c2-4b38-84a7-3f5b00fd5c33, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=faf94f4d-c395-46ad-93f8-9d7ef5f27d12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.245 104023 INFO neutron.agent.ovn.metadata.agent [-] Port faf94f4d-c395-46ad-93f8-9d7ef5f27d12 in datapath 256b80dd-b157-4570-860a-47654e0873df unbound from our chassis#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.248 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 256b80dd-b157-4570-860a-47654e0873df, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.249 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[47986c43-6a5f-482c-8e6f-dbab07e76c4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.250 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-256b80dd-b157-4570-860a-47654e0873df namespace which is not needed anymore#033[00m
Nov 22 03:06:20 np0005531888 podman[230686]: 2025-11-22 08:06:20.252867005 +0000 UTC m=+0.067070170 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, config_id=edpm, container_name=openstack_network_exporter)
Nov 22 03:06:20 np0005531888 kernel: tapfaf94f4d-c3: entered promiscuous mode
Nov 22 03:06:20 np0005531888 kernel: tapfaf94f4d-c3 (unregistering): left promiscuous mode
Nov 22 03:06:20 np0005531888 NetworkManager[55166]: <info>  [1763798780.3375] manager: (tapfaf94f4d-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.339 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00353|binding|INFO|Claiming lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 for this chassis.
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00354|binding|INFO|faf94f4d-c395-46ad-93f8-9d7ef5f27d12: Claiming fa:16:3e:9f:98:6f 10.100.0.12
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.347 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:98:6f 10.100.0.12'], port_security=['fa:16:3e:9f:98:6f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c25fedf6-8ee9-48d2-a91c-f5040b45cb61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-256b80dd-b157-4570-860a-47654e0873df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3cb45f10b9cc44f28a854f445948ff8d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb8b4101-05bc-4911-9b59-cd8e80593f59', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7820d97-20c2-4b38-84a7-3f5b00fd5c33, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=faf94f4d-c395-46ad-93f8-9d7ef5f27d12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00355|binding|INFO|Setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 ovn-installed in OVS
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00356|binding|INFO|Setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 up in Southbound
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00357|binding|INFO|Releasing lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 from this chassis (sb_readonly=1)
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00358|if_status|INFO|Dropped 6 log messages in last 55 seconds (most recently, 55 seconds ago) due to excessive rate
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00359|if_status|INFO|Not setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 down as sb is readonly
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00360|binding|INFO|Removing iface tapfaf94f4d-c3 ovn-installed in OVS
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.361 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.364 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.372 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00361|binding|INFO|Releasing lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 from this chassis (sb_readonly=0)
Nov 22 03:06:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:20Z|00362|binding|INFO|Setting lport faf94f4d-c395-46ad-93f8-9d7ef5f27d12 down in Southbound
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.389 186792 INFO nova.virt.libvirt.driver [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Instance destroyed successfully.#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.390 186792 DEBUG nova.objects.instance [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lazy-loading 'resources' on Instance uuid c25fedf6-8ee9-48d2-a91c-f5040b45cb61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.399 186792 DEBUG nova.virt.libvirt.vif [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:05:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=105,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK71tIxRMdncuWpyEZ3UI/snljNEMVFB7YULd++BTz1vmTgY+HZOnudxZ7/apao9vSiBdmD/tbqA0CZQO59zT5/wzEr9cAO0OmDAaw1kCXpC2NqTutjtHc5UhNEIUzEqeA==',key_name='tempest-keypair-1150638835',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:05:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3cb45f10b9cc44f28a854f445948ff8d',ramdisk_id='',reservation_id='r-matfnu47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-867436661',owner_user_name='tempest-ServersTestFqdnHostnames-867436661-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='68050a7a73c74478bc5c540f68e3e639',uuid=c25fedf6-8ee9-48d2-a91c-f5040b45cb61,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.399 186792 DEBUG nova.network.os_vif_util [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Converting VIF {"id": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "address": "fa:16:3e:9f:98:6f", "network": {"id": "256b80dd-b157-4570-860a-47654e0873df", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1625990719-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3cb45f10b9cc44f28a854f445948ff8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaf94f4d-c3", "ovs_interfaceid": "faf94f4d-c395-46ad-93f8-9d7ef5f27d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.400 186792 DEBUG nova.network.os_vif_util [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.400 186792 DEBUG os_vif [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.403 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfaf94f4d-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.404 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.407 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.408 186792 INFO os_vif [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:98:6f,bridge_name='br-int',has_traffic_filtering=True,id=faf94f4d-c395-46ad-93f8-9d7ef5f27d12,network=Network(256b80dd-b157-4570-860a-47654e0873df),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaf94f4d-c3')#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.409 186792 INFO nova.virt.libvirt.driver [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Deleting instance files /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61_del#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.410 186792 INFO nova.virt.libvirt.driver [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Deletion of /var/lib/nova/instances/c25fedf6-8ee9-48d2-a91c-f5040b45cb61_del complete#033[00m
Nov 22 03:06:20 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [NOTICE]   (230271) : haproxy version is 2.8.14-c23fe91
Nov 22 03:06:20 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [NOTICE]   (230271) : path to executable is /usr/sbin/haproxy
Nov 22 03:06:20 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [WARNING]  (230271) : Exiting Master process...
Nov 22 03:06:20 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [ALERT]    (230271) : Current worker (230273) exited with code 143 (Terminated)
Nov 22 03:06:20 np0005531888 neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df[230267]: [WARNING]  (230271) : All workers exited. Exiting... (0)
Nov 22 03:06:20 np0005531888 systemd[1]: libpod-e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065.scope: Deactivated successfully.
Nov 22 03:06:20 np0005531888 podman[230731]: 2025-11-22 08:06:20.464208862 +0000 UTC m=+0.124454221 container died e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.502 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:98:6f 10.100.0.12'], port_security=['fa:16:3e:9f:98:6f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c25fedf6-8ee9-48d2-a91c-f5040b45cb61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-256b80dd-b157-4570-860a-47654e0873df', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3cb45f10b9cc44f28a854f445948ff8d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb8b4101-05bc-4911-9b59-cd8e80593f59', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7820d97-20c2-4b38-84a7-3f5b00fd5c33, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=faf94f4d-c395-46ad-93f8-9d7ef5f27d12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:20 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e331bb25c8b3124f5bc65313327194a794e5d1519701bd5514a20e84c9bf3423-merged.mount: Deactivated successfully.
Nov 22 03:06:20 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065-userdata-shm.mount: Deactivated successfully.
Nov 22 03:06:20 np0005531888 podman[230731]: 2025-11-22 08:06:20.592852076 +0000 UTC m=+0.253097435 container cleanup e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:06:20 np0005531888 systemd[1]: libpod-conmon-e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065.scope: Deactivated successfully.
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.644 186792 INFO nova.compute.manager [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.644 186792 DEBUG oslo.service.loopingcall [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.645 186792 DEBUG nova.compute.manager [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.645 186792 DEBUG nova.network.neutron [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:06:20 np0005531888 podman[230775]: 2025-11-22 08:06:20.727552548 +0000 UTC m=+0.115068031 container remove e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.732 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b49b60f3-a5c9-4d7b-896d-34800e9b44a3]: (4, ('Sat Nov 22 08:06:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df (e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065)\ne9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065\nSat Nov 22 08:06:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-256b80dd-b157-4570-860a-47654e0873df (e9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065)\ne9185ef5e40becad8bd8d72e3604c8f027e55419bbcbb4d841ef5de9ccd01065\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.735 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fa146359-7ceb-4789-9664-d40a09c01f49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.736 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap256b80dd-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.737 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 kernel: tap256b80dd-b0: left promiscuous mode
Nov 22 03:06:20 np0005531888 nova_compute[186788]: 2025-11-22 08:06:20.752 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.754 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8c46704b-3796-4db4-ae2a-99a0ae5d2224]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.767 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d9cca89b-a052-49e2-888d-74f798394e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.768 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dd40225b-5b65-40dc-b091-8287d7e5fada]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.784 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc28d66-9d29-4817-818c-249da6144eff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548319, 'reachable_time': 36181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230790, 'error': None, 'target': 'ovnmeta-256b80dd-b157-4570-860a-47654e0873df', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.787 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-256b80dd-b157-4570-860a-47654e0873df deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.787 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f737a5d0-1edc-424c-ba36-ebb9572dbb73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 systemd[1]: run-netns-ovnmeta\x2d256b80dd\x2db157\x2d4570\x2d860a\x2d47654e0873df.mount: Deactivated successfully.
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.788 104023 INFO neutron.agent.ovn.metadata.agent [-] Port faf94f4d-c395-46ad-93f8-9d7ef5f27d12 in datapath 256b80dd-b157-4570-860a-47654e0873df unbound from our chassis#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.790 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 256b80dd-b157-4570-860a-47654e0873df, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.791 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bebee3ac-715f-42a9-b8df-d75c160b79e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.791 104023 INFO neutron.agent.ovn.metadata.agent [-] Port faf94f4d-c395-46ad-93f8-9d7ef5f27d12 in datapath 256b80dd-b157-4570-860a-47654e0873df unbound from our chassis#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.793 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 256b80dd-b157-4570-860a-47654e0873df, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:20.793 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f07fa131-ce4f-4bda-b5d5-149dd5e5ac84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.527 186792 DEBUG nova.compute.manager [req-6c2378e5-eff7-4492-8fce-91ca5e8b0379 req-77311445-164e-4036-995f-5555bd944b8b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-vif-unplugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.528 186792 DEBUG oslo_concurrency.lockutils [req-6c2378e5-eff7-4492-8fce-91ca5e8b0379 req-77311445-164e-4036-995f-5555bd944b8b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.528 186792 DEBUG oslo_concurrency.lockutils [req-6c2378e5-eff7-4492-8fce-91ca5e8b0379 req-77311445-164e-4036-995f-5555bd944b8b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.528 186792 DEBUG oslo_concurrency.lockutils [req-6c2378e5-eff7-4492-8fce-91ca5e8b0379 req-77311445-164e-4036-995f-5555bd944b8b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.528 186792 DEBUG nova.compute.manager [req-6c2378e5-eff7-4492-8fce-91ca5e8b0379 req-77311445-164e-4036-995f-5555bd944b8b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] No waiting events found dispatching network-vif-unplugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.529 186792 DEBUG nova.compute.manager [req-6c2378e5-eff7-4492-8fce-91ca5e8b0379 req-77311445-164e-4036-995f-5555bd944b8b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-vif-unplugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.963 186792 DEBUG nova.network.neutron [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:21 np0005531888 nova_compute[186788]: 2025-11-22 08:06:21.979 186792 INFO nova.compute.manager [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Took 1.33 seconds to deallocate network for instance.#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.064 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.065 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.194 186792 DEBUG nova.compute.provider_tree [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.213 186792 DEBUG nova.scheduler.client.report [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.241 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.286 186792 INFO nova.scheduler.client.report [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Deleted allocations for instance c25fedf6-8ee9-48d2-a91c-f5040b45cb61#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.357 186792 DEBUG oslo_concurrency.lockutils [None req-f0d5e4a5-7d90-42b2-9a9f-28bf706f0ce5 68050a7a73c74478bc5c540f68e3e639 3cb45f10b9cc44f28a854f445948ff8d - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.980 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.980 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:22 np0005531888 nova_compute[186788]: 2025-11-22 08:06:22.998 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.083 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.083 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.090 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.091 186792 INFO nova.compute.claims [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.111 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.220 186792 DEBUG nova.compute.provider_tree [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.235 186792 DEBUG nova.scheduler.client.report [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.255 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.256 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.302 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.303 186792 DEBUG nova.network.neutron [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.323 186792 INFO nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.339 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.512 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.514 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.515 186792 INFO nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Creating image(s)#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.516 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "/var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.516 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "/var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.517 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "/var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.530 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.595 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.596 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.597 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.609 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.629 186792 DEBUG nova.compute.manager [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.630 186792 DEBUG oslo_concurrency.lockutils [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.630 186792 DEBUG oslo_concurrency.lockutils [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.630 186792 DEBUG oslo_concurrency.lockutils [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c25fedf6-8ee9-48d2-a91c-f5040b45cb61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.630 186792 DEBUG nova.compute.manager [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] No waiting events found dispatching network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.631 186792 WARNING nova.compute.manager [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received unexpected event network-vif-plugged-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.631 186792 DEBUG nova.compute.manager [req-ecced5ce-cd8c-47b2-85b6-be9558d66063 req-af3818fc-57af-4639-99af-07a02cff47b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Received event network-vif-deleted-faf94f4d-c395-46ad-93f8-9d7ef5f27d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.664 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.665 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.810 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk 1073741824" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.811 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.812 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.867 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.868 186792 DEBUG nova.virt.disk.api [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Checking if we can resize image /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.869 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.941 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.942 186792 DEBUG nova.virt.disk.api [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Cannot resize image /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.942 186792 DEBUG nova.objects.instance [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'migration_context' on Instance uuid a5045f34-cbc5-4b30-8165-f1fe663be743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.960 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.961 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Ensure instance console log exists: /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.962 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.962 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:23 np0005531888 nova_compute[186788]: 2025-11-22 08:06:23.962 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:24 np0005531888 nova_compute[186788]: 2025-11-22 08:06:24.244 186792 DEBUG nova.policy [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:06:25 np0005531888 nova_compute[186788]: 2025-11-22 08:06:25.051 186792 DEBUG nova.network.neutron [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Successfully created port: 21f2a09e-e781-4e4e-9659-691ca54ee1d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:06:25 np0005531888 nova_compute[186788]: 2025-11-22 08:06:25.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:25 np0005531888 podman[230806]: 2025-11-22 08:06:25.69562959 +0000 UTC m=+0.063534133 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 03:06:25 np0005531888 podman[230807]: 2025-11-22 08:06:25.750145491 +0000 UTC m=+0.117843939 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.386 186792 DEBUG nova.network.neutron [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Successfully updated port: 21f2a09e-e781-4e4e-9659-691ca54ee1d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.450 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "refresh_cache-a5045f34-cbc5-4b30-8165-f1fe663be743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.451 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquired lock "refresh_cache-a5045f34-cbc5-4b30-8165-f1fe663be743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.451 186792 DEBUG nova.network.neutron [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.524 186792 DEBUG nova.compute.manager [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-changed-21f2a09e-e781-4e4e-9659-691ca54ee1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.525 186792 DEBUG nova.compute.manager [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Refreshing instance network info cache due to event network-changed-21f2a09e-e781-4e4e-9659-691ca54ee1d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.525 186792 DEBUG oslo_concurrency.lockutils [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a5045f34-cbc5-4b30-8165-f1fe663be743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:26 np0005531888 nova_compute[186788]: 2025-11-22 08:06:26.677 186792 DEBUG nova.network.neutron [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.599 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.656 186792 DEBUG nova.network.neutron [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Updating instance_info_cache with network_info: [{"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.674 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Releasing lock "refresh_cache-a5045f34-cbc5-4b30-8165-f1fe663be743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.675 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Instance network_info: |[{"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.675 186792 DEBUG oslo_concurrency.lockutils [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a5045f34-cbc5-4b30-8165-f1fe663be743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.675 186792 DEBUG nova.network.neutron [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Refreshing network info cache for port 21f2a09e-e781-4e4e-9659-691ca54ee1d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.679 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Start _get_guest_xml network_info=[{"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.687 186792 WARNING nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.697 186792 DEBUG nova.virt.libvirt.host [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.699 186792 DEBUG nova.virt.libvirt.host [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.703 186792 DEBUG nova.virt.libvirt.host [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.705 186792 DEBUG nova.virt.libvirt.host [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.706 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.706 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.707 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.707 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.707 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.708 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.708 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.708 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.710 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.710 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.710 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.710 186792 DEBUG nova.virt.hardware [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.715 186792 DEBUG nova.virt.libvirt.vif [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1538326568',display_name='tempest-ServerRescueNegativeTestJSON-server-1538326568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1538326568',id=107,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-lhanirue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner_user_name
='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:23Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=a5045f34-cbc5-4b30-8165-f1fe663be743,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.716 186792 DEBUG nova.network.os_vif_util [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.717 186792 DEBUG nova.network.os_vif_util [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.717 186792 DEBUG nova.objects.instance [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a5045f34-cbc5-4b30-8165-f1fe663be743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.729 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <uuid>a5045f34-cbc5-4b30-8165-f1fe663be743</uuid>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <name>instance-0000006b</name>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1538326568</nova:name>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:06:27</nova:creationTime>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:user uuid="2c1b21c06c9b48d39e736b195bd12c8c">tempest-ServerRescueNegativeTestJSON-1724156244-project-member</nova:user>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:project uuid="8f7086819eb340f28dd7087159d82fa3">tempest-ServerRescueNegativeTestJSON-1724156244</nova:project>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        <nova:port uuid="21f2a09e-e781-4e4e-9659-691ca54ee1d8">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <entry name="serial">a5045f34-cbc5-4b30-8165-f1fe663be743</entry>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <entry name="uuid">a5045f34-cbc5-4b30-8165-f1fe663be743</entry>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.config"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:47:51:b8"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <target dev="tap21f2a09e-e7"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/console.log" append="off"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:06:27 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:06:27 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:06:27 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:06:27 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.730 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Preparing to wait for external event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.731 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.731 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.731 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.732 186792 DEBUG nova.virt.libvirt.vif [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1538326568',display_name='tempest-ServerRescueNegativeTestJSON-server-1538326568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1538326568',id=107,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-lhanirue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner
_user_name='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:23Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=a5045f34-cbc5-4b30-8165-f1fe663be743,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.732 186792 DEBUG nova.network.os_vif_util [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.733 186792 DEBUG nova.network.os_vif_util [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.733 186792 DEBUG os_vif [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.734 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.734 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.735 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.740 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.740 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21f2a09e-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.741 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21f2a09e-e7, col_values=(('external_ids', {'iface-id': '21f2a09e-e781-4e4e-9659-691ca54ee1d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:51:b8', 'vm-uuid': 'a5045f34-cbc5-4b30-8165-f1fe663be743'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:27 np0005531888 NetworkManager[55166]: <info>  [1763798787.7444] manager: (tap21f2a09e-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.746 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.750 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.751 186792 INFO os_vif [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7')#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.761 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.849 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.850 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.850 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No VIF found with MAC fa:16:3e:47:51:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:06:27 np0005531888 nova_compute[186788]: 2025-11-22 08:06:27.850 186792 INFO nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Using config drive#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.117 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.410 186792 INFO nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Creating config drive at /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.config#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.418 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg11n64d5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.551 186792 DEBUG oslo_concurrency.processutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg11n64d5" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:28 np0005531888 kernel: tap21f2a09e-e7: entered promiscuous mode
Nov 22 03:06:28 np0005531888 NetworkManager[55166]: <info>  [1763798788.6375] manager: (tap21f2a09e-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Nov 22 03:06:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:28Z|00363|binding|INFO|Claiming lport 21f2a09e-e781-4e4e-9659-691ca54ee1d8 for this chassis.
Nov 22 03:06:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:28Z|00364|binding|INFO|21f2a09e-e781-4e4e-9659-691ca54ee1d8: Claiming fa:16:3e:47:51:b8 10.100.0.13
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.639 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.644 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.658 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:51:b8 10.100.0.13'], port_security=['fa:16:3e:47:51:b8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9714091-78f6-46c8-b55b-4a278bd99b49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7086819eb340f28dd7087159d82fa3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '65f3b143-522b-4e83-8261-f97700b0bd79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a437e229-533d-4315-8ee6-05d493bb5ad7, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=21f2a09e-e781-4e4e-9659-691ca54ee1d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.660 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 21f2a09e-e781-4e4e-9659-691ca54ee1d8 in datapath f9714091-78f6-46c8-b55b-4a278bd99b49 bound to our chassis#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.662 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9714091-78f6-46c8-b55b-4a278bd99b49#033[00m
Nov 22 03:06:28 np0005531888 systemd-udevd[230871]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.677 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a40468-f872-4382-ad55-703c524fa60e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.678 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9714091-71 in ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:06:28 np0005531888 systemd-machined[153106]: New machine qemu-51-instance-0000006b.
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.681 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9714091-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.681 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7649c4d4-ef10-44d4-8b00-c9a79ba55cd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.683 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0679f787-5f04-40a8-9122-076f8467590b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 NetworkManager[55166]: <info>  [1763798788.6871] device (tap21f2a09e-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:06:28 np0005531888 NetworkManager[55166]: <info>  [1763798788.6881] device (tap21f2a09e-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.694 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ce935d-0ea3-4db2-94ae-2c7b223240bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 systemd[1]: Started Virtual Machine qemu-51-instance-0000006b.
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:28Z|00365|binding|INFO|Setting lport 21f2a09e-e781-4e4e-9659-691ca54ee1d8 ovn-installed in OVS
Nov 22 03:06:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:28Z|00366|binding|INFO|Setting lport 21f2a09e-e781-4e4e-9659-691ca54ee1d8 up in Southbound
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.721 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[396712b1-6140-4bda-b3d5-03e6b94f4fc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.747 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d45e1d63-18a1-47f2-8e8b-458048eaf7a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.752 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[596563ce-d6a6-4be1-bf43-7f9a52a8fbd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 NetworkManager[55166]: <info>  [1763798788.7537] manager: (tapf9714091-70): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Nov 22 03:06:28 np0005531888 systemd-udevd[230876]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.782 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a63c89c8-3a07-46b4-a4ad-5eeac1c0186a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.786 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[978df320-fc63-4213-9cb3-fd8c801bd61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 NetworkManager[55166]: <info>  [1763798788.8150] device (tapf9714091-70): carrier: link connected
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.820 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4b36ab-9b03-4176-a5db-4bda258ca586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.838 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bd554b6a-3d82-4aee-9225-18c424ee366b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9714091-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:55:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552945, 'reachable_time': 36625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230905, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.854 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a97a6e31-1f02-4086-9087-e408da4e63af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5583'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552945, 'tstamp': 552945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230906, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.877 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5bcfce74-880b-4897-a1cb-31c07c607b85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9714091-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:55:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552945, 'reachable_time': 36625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230907, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.911 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f25cf8c8-c050-4a03-b4f4-5f21c5e7aa60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.974 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb3ad8e-6c61-4287-a719-43b7d9768240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.975 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9714091-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.976 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.976 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9714091-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:28 np0005531888 NetworkManager[55166]: <info>  [1763798788.9785] manager: (tapf9714091-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.978 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 kernel: tapf9714091-70: entered promiscuous mode
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.983 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9714091-70, col_values=(('external_ids', {'iface-id': '298be65c-aa9e-4327-b67d-2a3d4f1acf68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.984 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:28Z|00367|binding|INFO|Releasing lport 298be65c-aa9e-4327-b67d-2a3d4f1acf68 from this chassis (sb_readonly=0)
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.987 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.989 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[35567e57-496b-4031-bbab-47b2211e1b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.990 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-f9714091-78f6-46c8-b55b-4a278bd99b49
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID f9714091-78f6-46c8-b55b-4a278bd99b49
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:06:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:28.990 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'env', 'PROCESS_TAG=haproxy-f9714091-78f6-46c8-b55b-4a278bd99b49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9714091-78f6-46c8-b55b-4a278bd99b49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:06:28 np0005531888 nova_compute[186788]: 2025-11-22 08:06:28.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.165 186792 DEBUG nova.compute.manager [req-e8028b7f-f091-4a2c-a877-11fd730edd52 req-2133c641-bdf6-4bc2-91b8-cc3a5d2f4cce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.166 186792 DEBUG oslo_concurrency.lockutils [req-e8028b7f-f091-4a2c-a877-11fd730edd52 req-2133c641-bdf6-4bc2-91b8-cc3a5d2f4cce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.166 186792 DEBUG oslo_concurrency.lockutils [req-e8028b7f-f091-4a2c-a877-11fd730edd52 req-2133c641-bdf6-4bc2-91b8-cc3a5d2f4cce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.167 186792 DEBUG oslo_concurrency.lockutils [req-e8028b7f-f091-4a2c-a877-11fd730edd52 req-2133c641-bdf6-4bc2-91b8-cc3a5d2f4cce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.168 186792 DEBUG nova.compute.manager [req-e8028b7f-f091-4a2c-a877-11fd730edd52 req-2133c641-bdf6-4bc2-91b8-cc3a5d2f4cce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Processing event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.223 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.224 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798789.222154, a5045f34-cbc5-4b30-8165-f1fe663be743 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.225 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] VM Started (Lifecycle Event)#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.228 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.233 186792 INFO nova.virt.libvirt.driver [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Instance spawned successfully.#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.235 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.261 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.266 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.267 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.268 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.268 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.269 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.269 186792 DEBUG nova.virt.libvirt.driver [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.273 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:06:29 np0005531888 podman[230945]: 2025-11-22 08:06:29.334211371 +0000 UTC m=+0.020602688 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.481 186792 INFO nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Took 5.97 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.482 186792 DEBUG nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.567 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.568 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798789.2223053, a5045f34-cbc5-4b30-8165-f1fe663be743 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.568 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.608 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.612 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798789.2280984, a5045f34-cbc5-4b30-8165-f1fe663be743 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.612 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.630 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.637 186792 INFO nova.compute.manager [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Took 6.58 seconds to build instance.#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.639 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:06:29 np0005531888 nova_compute[186788]: 2025-11-22 08:06:29.669 186792 DEBUG oslo_concurrency.lockutils [None req-1c18ced3-e4dc-4f4f-98fe-0bd32aaa2562 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:29 np0005531888 podman[230945]: 2025-11-22 08:06:29.783708574 +0000 UTC m=+0.470099881 container create b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 03:06:29 np0005531888 systemd[1]: Started libpod-conmon-b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8.scope.
Nov 22 03:06:29 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:06:29 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e198e64d0a68a47eeba7f1602f3e39a50eaae99c28b21554e713613317ec9274/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:06:30 np0005531888 podman[230945]: 2025-11-22 08:06:30.012917021 +0000 UTC m=+0.699308358 container init b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 03:06:30 np0005531888 podman[230945]: 2025-11-22 08:06:30.018841366 +0000 UTC m=+0.705232673 container start b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:06:30 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [NOTICE]   (230964) : New worker (230966) forked
Nov 22 03:06:30 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [NOTICE]   (230964) : Loading success.
Nov 22 03:06:30 np0005531888 nova_compute[186788]: 2025-11-22 08:06:30.661 186792 DEBUG nova.network.neutron [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Updated VIF entry in instance network info cache for port 21f2a09e-e781-4e4e-9659-691ca54ee1d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:06:30 np0005531888 nova_compute[186788]: 2025-11-22 08:06:30.661 186792 DEBUG nova.network.neutron [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Updating instance_info_cache with network_info: [{"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:30 np0005531888 nova_compute[186788]: 2025-11-22 08:06:30.683 186792 DEBUG oslo_concurrency.lockutils [req-bfda5000-863d-4e0a-89eb-fd9119c14e6b req-7e7fc4e1-d436-48b2-81ff-2b1ea0317ce8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a5045f34-cbc5-4b30-8165-f1fe663be743" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:31 np0005531888 nova_compute[186788]: 2025-11-22 08:06:31.469 186792 DEBUG nova.compute.manager [req-f36cfd1e-37c2-439d-891f-7203e5ee9b59 req-708289da-dbd2-4ef7-aec8-b6c7e32ad4df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:31 np0005531888 nova_compute[186788]: 2025-11-22 08:06:31.470 186792 DEBUG oslo_concurrency.lockutils [req-f36cfd1e-37c2-439d-891f-7203e5ee9b59 req-708289da-dbd2-4ef7-aec8-b6c7e32ad4df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:31 np0005531888 nova_compute[186788]: 2025-11-22 08:06:31.470 186792 DEBUG oslo_concurrency.lockutils [req-f36cfd1e-37c2-439d-891f-7203e5ee9b59 req-708289da-dbd2-4ef7-aec8-b6c7e32ad4df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:31 np0005531888 nova_compute[186788]: 2025-11-22 08:06:31.470 186792 DEBUG oslo_concurrency.lockutils [req-f36cfd1e-37c2-439d-891f-7203e5ee9b59 req-708289da-dbd2-4ef7-aec8-b6c7e32ad4df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:31 np0005531888 nova_compute[186788]: 2025-11-22 08:06:31.471 186792 DEBUG nova.compute.manager [req-f36cfd1e-37c2-439d-891f-7203e5ee9b59 req-708289da-dbd2-4ef7-aec8-b6c7e32ad4df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] No waiting events found dispatching network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:31 np0005531888 nova_compute[186788]: 2025-11-22 08:06:31.471 186792 WARNING nova.compute.manager [req-f36cfd1e-37c2-439d-891f-7203e5ee9b59 req-708289da-dbd2-4ef7-aec8-b6c7e32ad4df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received unexpected event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:06:32 np0005531888 nova_compute[186788]: 2025-11-22 08:06:32.745 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531888 nova_compute[186788]: 2025-11-22 08:06:33.121 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:34 np0005531888 podman[230976]: 2025-11-22 08:06:34.691039433 +0000 UTC m=+0.055419133 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:06:34 np0005531888 podman[230975]: 2025-11-22 08:06:34.693503714 +0000 UTC m=+0.058253554 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:06:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:35.142 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:28:f1 10.100.0.2 2001:db8::f816:3eff:fe1e:28f1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe1e:28f1/64', 'neutron:device_id': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0abd56a4-3e9e-4d28-8383-eadcda41744d) old=Port_Binding(mac=['fa:16:3e:1e:28:f1 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:35.143 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0abd56a4-3e9e-4d28-8383-eadcda41744d in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e updated#033[00m
Nov 22 03:06:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:35.145 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90da6fca-65d1-4012-9602-d88842a0ad0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:35.148 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4530d593-ff6e-4bb5-951b-b48a6c4ee72c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:35 np0005531888 nova_compute[186788]: 2025-11-22 08:06:35.388 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798780.3875084, c25fedf6-8ee9-48d2-a91c-f5040b45cb61 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:35 np0005531888 nova_compute[186788]: 2025-11-22 08:06:35.389 186792 INFO nova.compute.manager [-] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:06:35 np0005531888 nova_compute[186788]: 2025-11-22 08:06:35.405 186792 DEBUG nova.compute.manager [None req-d5987902-cc98-4717-91d3-e568cf36c7f2 - - - - - -] [instance: c25fedf6-8ee9-48d2-a91c-f5040b45cb61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:36.821 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:36.822 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:06:36.822 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.847 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006b', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8f7086819eb340f28dd7087159d82fa3', 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'hostId': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.854 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a5045f34-cbc5-4b30-8165-f1fe663be743 / tap21f2a09e-e7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.854 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e9a0de6-84b6-4b05-8a88-c9cb443febf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.849019', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2aec57bc-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': '0104e4fd9382430d4cf5a560e5d14d4cc3ab6d2f315964cd8b31bcc5ebe68055'}]}, 'timestamp': '2025-11-22 08:06:36.855755', '_unique_id': 'dbec303a239844bc9d32b44a13f6efa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.857 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.858 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.858 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c44df9b9-9a86-4e1e-ae6f-15e131c01fe1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.858286', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2aecca62-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': 'b237e581819cc186dd9ada2791b21b1d08e60fd3f6b357f28bccfee6e103b9a3'}]}, 'timestamp': '2025-11-22 08:06:36.858767', '_unique_id': 'fef52a816f9745dd90a5129e94c6033b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.859 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.889 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.890 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f6472d1-3fac-47a4-a082-8bd7b0c08cf3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.860031', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af19506-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '99000183f08e79fbaf446fef30696b4d21f806355107a3a14fb4d19089ca5fdd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.860031', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af1a1e0-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '14e20e9ffbae5ca702a2c517bd3246e3ee305bf0c42efcd32508df047b7641d8'}]}, 'timestamp': '2025-11-22 08:06:36.890273', '_unique_id': 'b9a37c020bc44b87b20eaf511af3b668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.892 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.892 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '920fc1d4-5ad9-4e41-bffb-fb2e0b1789d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.892004', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af1efc4-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '1647f087d0b5e829ddd4848db5dd0d8e1fead0337723ee88d1c9609b5e0c0055'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.892004', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af1f9d8-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '8844d248346e95c2bff5977fb6fed3c3e523b243de8fa27b4eabd7a0e4fb69ba'}]}, 'timestamp': '2025-11-22 08:06:36.892546', '_unique_id': 'd795c96a93b345149a0c378e43209414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.894 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c328ec9-451b-4e26-8e47-5e1c1d87a6ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.894225', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2af246ae-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': '1afa8871d6b6106904d5a85abba7d3648bbf2a2f03d179bfa297fa7ab19ad8fa'}]}, 'timestamp': '2025-11-22 08:06:36.894515', '_unique_id': 'c2ba5f23a05e473fa7a7c170da80b042'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.895 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.912 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.913 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ac539ee-cbd6-428e-9774-a7f3a8dc00d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.895868', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af514ec-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.595385542, 'message_signature': '16960d6aa562d04cfb82c5cc6abbdaf49447423e863834514a1a540191dfc33e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.895868', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af522ac-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.595385542, 'message_signature': '673a748e712a7d07ac6a0fe93beb94079b0c59262c73a984c54bd404a8273825'}]}, 'timestamp': '2025-11-22 08:06:36.913274', '_unique_id': 'b65276a5eee241b2bd4aabcf0d4b86c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.915 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.915 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>]
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ad16844-61dd-439f-b55b-54ce728cc1ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.916041', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2af59a20-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': '7097d61897724eb987fc6a3902aec769274b2fb9c7b79753628c3708937e706b'}]}, 'timestamp': '2025-11-22 08:06:36.916289', '_unique_id': '5221bcdaf3e742aeb4a6fc1994e80e09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.917 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.917 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c53ceb84-55de-4e78-9be1-a73b5fa8afc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.917405', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af5cfd6-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '5ba7b148d1332fd5fab4f8bd952d97f23b3b2c5a4c4ad9147e65968ac4cd932b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.917405', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af5d94a-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': 'f89fadb0f9faccc9ad30547af57710a55323a12979b1e1b462563a4e1ae7853c'}]}, 'timestamp': '2025-11-22 08:06:36.917891', '_unique_id': '374da2dae94045b79f0232387db4c3a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1d5e9bf-d288-4762-8744-6ebfbb2dc923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.919026', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af60e10-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': 'db1f8cd91d21e99828ec9f6c3ec1fbcb559085e7c868db239277072f7ed5e9f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.919026', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af61608-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': 'be9a66834c169771075ddf3f5b7a70a5910d387284870ed4b8ecf095e5bfb7d8'}]}, 'timestamp': '2025-11-22 08:06:36.919448', '_unique_id': '328e2edb9d91403f8ecea5fdef7d6d35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.937 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.937 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance a5045f34-cbc5-4b30-8165-f1fe663be743: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.937 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b77f06d9-1dab-47e0-b7f3-2eb594b0996b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.937750', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2af8ef72-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': '14a73dfe5dec02de206cc84d31db2a10e908d8320e47df1d4f516066aa3c5380'}]}, 'timestamp': '2025-11-22 08:06:36.938312', '_unique_id': '1a1817b10e574bc6bb8ce4abf3ca6bec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.940 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/cpu volume: 7050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f237ff8d-98f1-4420-9449-ed0ba4b4706a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7050000000, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'timestamp': '2025-11-22T08:06:36.940403', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2af95188-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.63638033, 'message_signature': '5ef206ef81f4660f255e6c826d0e5cf7de9814ced12fc82d40b10fb013f36711'}]}, 'timestamp': '2025-11-22 08:06:36.940693', '_unique_id': '3db18b22f8b644bf8edb71f822af29cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.941 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3136316-4bb4-4ef8-ba0d-a724f5eeca5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.941895', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af98c7a-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '0dcf8e1e26a963787d5a9cac6c691286ae48ee1fd8ce6d88118c6df5b9e1a67a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.941895', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af9949a-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '1151b838a8d45107973fec98f860515e8af51a9e845e8418be44e3dc07ef968a'}]}, 'timestamp': '2025-11-22 08:06:36.942356', '_unique_id': 'dfdab985e61a495d9baa01e94536324a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.943 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.943 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.read.latency volume: 1581110312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.943 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.read.latency volume: 65895200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d035e99-e640-4860-a01b-97bd0e0388d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1581110312, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.943721', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2af9d34c-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': '94f5d7338a35a518f7323560871a16c631776b67e3a1f3eba953dc50a964fca6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 65895200, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.943721', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2af9dd10-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.55953716, 'message_signature': 'f0ea7b5ba1a92c48dacf3654d4dce145cd124e6476bf72cc5e39cb75df964625'}]}, 'timestamp': '2025-11-22 08:06:36.944245', '_unique_id': 'fc3a7184ebd04a93bafaf00a195d7c02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.946 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.946 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.946 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>]
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.946 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7070a7b6-dedf-4ee0-8642-0d27801448bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.946773', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2afa51c8-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': '36e78cb551a7cb555bb0d409111edc3192fe41952d68bc00051d706c12481e92'}]}, 'timestamp': '2025-11-22 08:06:36.947371', '_unique_id': '09bb6e0ea1dd4a45b0adc56552b9687b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.950 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c18be5c-596d-45ec-b4c7-a9f8e6ccd6a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.950245', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2afad526-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': 'f4d6b854c26f9d6a2c0c5f6a4af8233783d33814046ee5e0668b07694da4b630'}]}, 'timestamp': '2025-11-22 08:06:36.950679', '_unique_id': '6348d4b9092f45a6ab2e2f83bdfe548f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.952 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.952 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>]
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.952 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1ba6e0a-9d13-4301-a777-f9809b6fc205', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.952772', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2afb3700-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': 'ab08c33419681e50b3e0c91053b45dde29fc493e555e3a0776c441aea6af5521'}]}, 'timestamp': '2025-11-22 08:06:36.953097', '_unique_id': '6f9f357134464fa4a7236ff7857e332d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.968 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e79919e6-06e1-4bca-953d-d9022aefb40c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.968468', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2afd9f40-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': 'eb5e3942af022a6f76eed694d9c9439243042682b24bb801b094ccd2a1959f35'}]}, 'timestamp': '2025-11-22 08:06:36.968972', '_unique_id': 'a07e2f911928412ba6e06e52ae4ac84b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.970 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.970 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.971 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0340157-089e-4c6b-8fcd-516cd7a99457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.970886', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2afdfa9e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.595385542, 'message_signature': '2979c6c173533a0bf9c3e2f6bce9de0a93dba16b30c85352422d5dc85831af91'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.970886', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2afe04b2-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.595385542, 'message_signature': 'f6820f9877342b5f62c769be5c4b810f97c8a6095f303a654cc9587c17772429'}]}, 'timestamp': '2025-11-22 08:06:36.971467', '_unique_id': '99e09501c4fb4e569e76612410ca506a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.973 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.973 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8de90e8-44ea-445b-a0f7-d1e6a1f6f18b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-vda', 'timestamp': '2025-11-22T08:06:36.973484', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2afe5f48-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.595385542, 'message_signature': '849fb1298e49ae1e498ce8cd407771302eceaf063b4933ffc79f1034d1416f56'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743-sda', 'timestamp': '2025-11-22T08:06:36.973484', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'instance-0000006b', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2afe67c2-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.595385542, 'message_signature': '6123776ae215b6415d95a573b8994d118213a05265cf50b369c316aedb8b0edb'}]}, 'timestamp': '2025-11-22 08:06:36.973966', '_unique_id': '8f38ba61672946e3b40f4f11e464c043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.975 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.975 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1538326568>]
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.975 12 DEBUG ceilometer.compute.pollsters [-] a5045f34-cbc5-4b30-8165-f1fe663be743/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '563d9dcf-5dbd-4df6-93a7-6a628f1ffc3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006b-a5045f34-cbc5-4b30-8165-f1fe663be743-tap21f2a09e-e7', 'timestamp': '2025-11-22T08:06:36.975946', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1538326568', 'name': 'tap21f2a09e-e7', 'instance_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'instance_type': 'm1.nano', 'host': 'e48d89825dc0af02b2b2c1dd8dc1856f90a4c1ec159b3e7b1d8f6b1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:51:b8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap21f2a09e-e7'}, 'message_id': '2afebdee-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5537.548516969, 'message_signature': '70d35c65b84a34dd893abb58b73ec45cdaadcbf87cc6a07dd1653a9cff82e36b'}]}, 'timestamp': '2025-11-22 08:06:36.976187', '_unique_id': '8e78d0b9024544ca878268e373377b37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:06:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:37 np0005531888 nova_compute[186788]: 2025-11-22 08:06:37.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:38 np0005531888 nova_compute[186788]: 2025-11-22 08:06:38.121 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:42 np0005531888 nova_compute[186788]: 2025-11-22 08:06:42.750 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:43 np0005531888 nova_compute[186788]: 2025-11-22 08:06:43.123 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:43 np0005531888 podman[231028]: 2025-11-22 08:06:43.685579484 +0000 UTC m=+0.057768592 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 22 03:06:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:44Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:51:b8 10.100.0.13
Nov 22 03:06:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:06:44Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:51:b8 10.100.0.13
Nov 22 03:06:46 np0005531888 podman[231048]: 2025-11-22 08:06:46.6778081 +0000 UTC m=+0.047608192 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:06:47 np0005531888 nova_compute[186788]: 2025-11-22 08:06:47.752 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.126 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.623 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.624 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.639 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.729 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.729 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.737 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.738 186792 INFO nova.compute.claims [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.867 186792 DEBUG nova.compute.provider_tree [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.879 186792 DEBUG nova.scheduler.client.report [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.901 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.902 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.964 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.964 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.980 186792 INFO nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:06:48 np0005531888 nova_compute[186788]: 2025-11-22 08:06:48.995 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.093 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.095 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.095 186792 INFO nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Creating image(s)#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.096 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "/var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.096 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "/var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.097 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "/var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
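The three lockutils lines above show the acquire/write/release pattern that serializes writers of the instance's disk.info file. A minimal sketch of that pattern using a plain fcntl file lock — not oslo_concurrency's actual implementation; the `.lock` sidecar path and the function name are illustrative:

```python
import fcntl

def write_disk_info(info_path: str, contents: str) -> None:
    """Serialize concurrent writers to a disk.info file with an exclusive
    fcntl lock, similar in spirit to oslo_concurrency.lockutils' external
    file locks (lock name and layout here are illustrative, not nova's)."""
    lock_path = info_path + ".lock"
    with open(lock_path, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)      # "Acquiring lock ..."
        try:
            with open(info_path, "w") as f:        # critical section
                f.write(contents)
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)  # lock "released"
```

The held time in the log (0.001s) reflects how short this critical section is: only the file write happens under the lock.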
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.114 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.178 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
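Each `qemu-img info` call above is wrapped in `oslo_concurrency.prlimit --as=1073741824 --cpu=30`, i.e. a 1 GiB address-space cap and a 30-second CPU cap on the child, so a pathological image cannot hang or exhaust the compute host. The same effect can be sketched with the stdlib `resource` module and `preexec_fn` (Linux-only; `run_limited` is an illustrative name, not nova's API):

```python
import resource
import subprocess

def run_limited(cmd, as_bytes=1 << 30, cpu_secs=30):
    """Run cmd with RLIMIT_AS and RLIMIT_CPU applied in the child before
    exec, approximating `python -m oslo_concurrency.prlimit --as ... --cpu ...`
    as seen in the log. Values default to the log's 1 GiB / 30 s limits."""
    def set_limits():
        # Runs in the forked child, before exec of cmd.
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_secs, cpu_secs))
    return subprocess.run(
        cmd,
        preexec_fn=set_limits,
        env={"LC_ALL": "C", "LANG": "C"},  # matches the logged env wrapper
        capture_output=True,
        text=True,
    )
```

With qemu-img installed, `run_limited(["qemu-img", "info", path, "--force-share", "--output=json"])` reproduces the logged command shape.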
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.181 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.182 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.197 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.222 186792 DEBUG nova.policy [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1b53cb76c914b98afb487ff6059ebfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
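The policy line above is expected, not an error: the caller holds only the `member` and `reader` roles with `is_admin: False`, and `network:attach_external_network` is by default restricted to admin callers, so nova simply records the failed check and continues without external-network attach rights. A deliberately simplified sketch of that admin-only evaluation (the real oslo.policy engine evaluates configurable rule strings; this helper is illustrative):

```python
def check_admin_only(credentials: dict) -> bool:
    """Approximate an admin-only policy rule such as the default for
    network:attach_external_network: pass only when the request context
    carries admin rights. Real deployments may override this via policy
    files, which this sketch ignores."""
    return bool(credentials.get("is_admin"))
```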
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.262 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.262 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.325 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.326 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.327 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.397 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.398 186792 DEBUG nova.virt.disk.api [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Checking if we can resize image /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.399 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.466 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.467 186792 DEBUG nova.virt.disk.api [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Cannot resize image /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
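The `can_resize_image` check above compares the requested size against the virtual size reported by `qemu-img info --output=json` (the `virtual-size` field, in bytes); here the flavor's 1 GiB root disk equals the freshly created overlay's virtual size, so the resize is skipped rather than failing the build. A simplified version of that comparison (function name illustrative, not nova's exact code):

```python
import json

def can_resize(qemu_img_info_json: str, requested_bytes: int) -> bool:
    """Return True only when the requested size is strictly larger than
    the image's current virtual size: qcow2 images cannot safely shrink,
    and resizing to the identical size is a no-op."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    if requested_bytes <= virtual_size:
        # Matches the log: "Cannot resize image ... to a smaller size."
        return False
    return True
```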
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.467 186792 DEBUG nova.objects.instance [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lazy-loading 'migration_context' on Instance uuid dca0936b-0f9e-4a24-a613-087b4a117a05 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.478 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.478 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Ensure instance console log exists: /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.479 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.479 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:49 np0005531888 nova_compute[186788]: 2025-11-22 08:06:49.479 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:50 np0005531888 nova_compute[186788]: 2025-11-22 08:06:50.253 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Successfully created port: 9397b73b-bcd8-436d-ac03-5350a4a977b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:06:50 np0005531888 podman[231087]: 2025-11-22 08:06:50.681678444 +0000 UTC m=+0.053715552 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64)
Nov 22 03:06:51 np0005531888 nova_compute[186788]: 2025-11-22 08:06:51.132 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Successfully created port: d79c107f-7912-4ccc-9651-f7791b24d22c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.069 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Successfully created port: f6434b8e-2b54-4b57-a8c5-ce06c6d6325d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.754 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.768 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Successfully updated port: 9397b73b-bcd8-436d-ac03-5350a4a977b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.877 186792 DEBUG nova.compute.manager [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-changed-9397b73b-bcd8-436d-ac03-5350a4a977b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.878 186792 DEBUG nova.compute.manager [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Refreshing instance network info cache due to event network-changed-9397b73b-bcd8-436d-ac03-5350a4a977b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.878 186792 DEBUG oslo_concurrency.lockutils [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.878 186792 DEBUG oslo_concurrency.lockutils [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:52 np0005531888 nova_compute[186788]: 2025-11-22 08:06:52.878 186792 DEBUG nova.network.neutron [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Refreshing network info cache for port 9397b73b-bcd8-436d-ac03-5350a4a977b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:06:53 np0005531888 nova_compute[186788]: 2025-11-22 08:06:53.059 186792 DEBUG nova.network.neutron [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:06:53 np0005531888 nova_compute[186788]: 2025-11-22 08:06:53.419 186792 DEBUG nova.network.neutron [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:53 np0005531888 nova_compute[186788]: 2025-11-22 08:06:53.432 186792 DEBUG oslo_concurrency.lockutils [req-ad888ca9-d51a-4614-8602-34985748ada1 req-dc6e00a2-3096-4377-b62d-e95cb916b78f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:53 np0005531888 nova_compute[186788]: 2025-11-22 08:06:53.475 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:53 np0005531888 nova_compute[186788]: 2025-11-22 08:06:53.583 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Successfully updated port: d79c107f-7912-4ccc-9651-f7791b24d22c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.970 186792 DEBUG nova.compute.manager [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-changed-d79c107f-7912-4ccc-9651-f7791b24d22c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.970 186792 DEBUG nova.compute.manager [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Refreshing instance network info cache due to event network-changed-d79c107f-7912-4ccc-9651-f7791b24d22c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.971 186792 DEBUG oslo_concurrency.lockutils [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.971 186792 DEBUG oslo_concurrency.lockutils [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.971 186792 DEBUG nova.network.neutron [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Refreshing network info cache for port d79c107f-7912-4ccc-9651-f7791b24d22c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.985 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:06:54 np0005531888 nova_compute[186788]: 2025-11-22 08:06:54.985 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.312 186792 DEBUG nova.network.neutron [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.592 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Successfully updated port: f6434b8e-2b54-4b57-a8c5-ce06c6d6325d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.615 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.670 186792 DEBUG nova.network.neutron [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.690 186792 DEBUG oslo_concurrency.lockutils [req-e269c6f2-e140-43bc-bffa-008b87ab9e5c req-70b56300-32ab-40d8-84e2-cbdde9764f71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.691 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquired lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.691 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:06:55 np0005531888 nova_compute[186788]: 2025-11-22 08:06:55.824 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:06:56 np0005531888 podman[231108]: 2025-11-22 08:06:56.684594582 +0000 UTC m=+0.056052850 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:06:56 np0005531888 podman[231109]: 2025-11-22 08:06:56.718722221 +0000 UTC m=+0.085890253 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Nov 22 03:06:56 np0005531888 nova_compute[186788]: 2025-11-22 08:06:56.980 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:57 np0005531888 nova_compute[186788]: 2025-11-22 08:06:57.071 186792 DEBUG nova.compute.manager [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-changed-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:57 np0005531888 nova_compute[186788]: 2025-11-22 08:06:57.071 186792 DEBUG nova.compute.manager [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Refreshing instance network info cache due to event network-changed-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:06:57 np0005531888 nova_compute[186788]: 2025-11-22 08:06:57.071 186792 DEBUG oslo_concurrency.lockutils [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:57 np0005531888 nova_compute[186788]: 2025-11-22 08:06:57.757 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:57 np0005531888 nova_compute[186788]: 2025-11-22 08:06:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:57 np0005531888 nova_compute[186788]: 2025-11-22 08:06:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:58 np0005531888 nova_compute[186788]: 2025-11-22 08:06:58.478 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:00 np0005531888 nova_compute[186788]: 2025-11-22 08:07:00.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:01 np0005531888 nova_compute[186788]: 2025-11-22 08:07:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:01 np0005531888 nova_compute[186788]: 2025-11-22 08:07:01.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.071 186792 DEBUG nova.network.neutron [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updating instance_info_cache with network_info: [{"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.096 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Releasing lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.097 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance network_info: |[{"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.097 186792 DEBUG oslo_concurrency.lockutils [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.097 186792 DEBUG nova.network.neutron [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Refreshing network info cache for port f6434b8e-2b54-4b57-a8c5-ce06c6d6325d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.101 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Start _get_guest_xml network_info=[{"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.106 186792 WARNING nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.111 186792 DEBUG nova.virt.libvirt.host [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.113 186792 DEBUG nova.virt.libvirt.host [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.122 186792 DEBUG nova.virt.libvirt.host [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.123 186792 DEBUG nova.virt.libvirt.host [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.124 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.124 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.125 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.125 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.126 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.126 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.126 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.127 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.127 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.127 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.128 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.128 186792 DEBUG nova.virt.hardware [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.133 186792 DEBUG nova.virt.libvirt.vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:49Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.134 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.135 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.136 186792 DEBUG nova.virt.libvirt.vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:49Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.136 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.137 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.137 186792 DEBUG nova.virt.libvirt.vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:49Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.138 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.138 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.139 186792 DEBUG nova.objects.instance [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid dca0936b-0f9e-4a24-a613-087b4a117a05 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.155 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <uuid>dca0936b-0f9e-4a24-a613-087b4a117a05</uuid>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <name>instance-0000006e</name>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersTestMultiNic-server-10239088</nova:name>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:07:02</nova:creationTime>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:user uuid="d1b53cb76c914b98afb487ff6059ebfe">tempest-ServersTestMultiNic-1824610177-project-member</nova:user>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:project uuid="dd19aebd63694f83a0bdbf1e376177d5">tempest-ServersTestMultiNic-1824610177</nova:project>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:port uuid="9397b73b-bcd8-436d-ac03-5350a4a977b8">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.184" ipVersion="4"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:port uuid="d79c107f-7912-4ccc-9651-f7791b24d22c">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.1.21" ipVersion="4"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        <nova:port uuid="f6434b8e-2b54-4b57-a8c5-ce06c6d6325d">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.154" ipVersion="4"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <entry name="serial">dca0936b-0f9e-4a24-a613-087b4a117a05</entry>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <entry name="uuid">dca0936b-0f9e-4a24-a613-087b4a117a05</entry>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.config"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:5b:5b:b6"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <target dev="tap9397b73b-bc"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:cc:73:ce"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <target dev="tapd79c107f-79"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:4d:68:af"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <target dev="tapf6434b8e-2b"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/console.log" append="off"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:07:02 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:07:02 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:07:02 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:07:02 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.156 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Preparing to wait for external event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.157 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.157 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.157 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.157 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Preparing to wait for external event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.157 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.158 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.158 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.158 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Preparing to wait for external event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.158 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.158 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.159 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.159 186792 DEBUG nova.virt.libvirt.vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:49Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.160 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.160 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.161 186792 DEBUG os_vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.161 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.161 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.162 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.165 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.165 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9397b73b-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.165 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9397b73b-bc, col_values=(('external_ids', {'iface-id': '9397b73b-bcd8-436d-ac03-5350a4a977b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:5b:b6', 'vm-uuid': 'dca0936b-0f9e-4a24-a613-087b4a117a05'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.167 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 NetworkManager[55166]: <info>  [1763798822.1680] manager: (tap9397b73b-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.169 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.178 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.179 186792 INFO os_vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc')#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.180 186792 DEBUG nova.virt.libvirt.vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:49Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.181 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.181 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.181 186792 DEBUG os_vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.182 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.182 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.182 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.184 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.184 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd79c107f-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.184 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd79c107f-79, col_values=(('external_ids', {'iface-id': 'd79c107f-7912-4ccc-9651-f7791b24d22c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:73:ce', 'vm-uuid': 'dca0936b-0f9e-4a24-a613-087b4a117a05'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.185 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 NetworkManager[55166]: <info>  [1763798822.1864] manager: (tapd79c107f-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.188 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.192 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.193 186792 INFO os_vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79')#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.193 186792 DEBUG nova.virt.libvirt.vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:49Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.194 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.194 186792 DEBUG nova.network.os_vif_util [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.194 186792 DEBUG os_vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.195 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.195 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.195 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.197 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.198 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6434b8e-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.198 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6434b8e-2b, col_values=(('external_ids', {'iface-id': 'f6434b8e-2b54-4b57-a8c5-ce06c6d6325d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:68:af', 'vm-uuid': 'dca0936b-0f9e-4a24-a613-087b4a117a05'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.199 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 NetworkManager[55166]: <info>  [1763798822.1999] manager: (tapf6434b8e-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.207 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.208 186792 INFO os_vif [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b')#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.725 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.726 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.726 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No VIF found with MAC fa:16:3e:5b:5b:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.727 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No VIF found with MAC fa:16:3e:cc:73:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.728 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No VIF found with MAC fa:16:3e:4d:68:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:02 np0005531888 nova_compute[186788]: 2025-11-22 08:07:02.729 186792 INFO nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Using config drive#033[00m
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.274 186792 INFO nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Creating config drive at /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.config#033[00m
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.281 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp37b5418 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.407 186792 DEBUG oslo_concurrency.processutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp37b5418" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.4711] manager: (tap9397b73b-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Nov 22 03:07:03 np0005531888 kernel: tap9397b73b-bc: entered promiscuous mode
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00368|binding|INFO|Claiming lport 9397b73b-bcd8-436d-ac03-5350a4a977b8 for this chassis.
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00369|binding|INFO|9397b73b-bcd8-436d-ac03-5350a4a977b8: Claiming fa:16:3e:5b:5b:b6 10.100.0.184
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.478 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.4868] manager: (tapd79c107f-79): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.491 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:5b:b6 10.100.0.184'], port_security=['fa:16:3e:5b:5b:b6 10.100.0.184'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.184/24', 'neutron:device_id': 'dca0936b-0f9e-4a24-a613-087b4a117a05', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b19c0ba2-0afd-4104-905d-6395b6c0bd43, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=9397b73b-bcd8-436d-ac03-5350a4a977b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.493 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 9397b73b-bcd8-436d-ac03-5350a4a977b8 in datapath 6179d9b5-e6e9-4125-83a8-b9f05e25d45c bound to our chassis#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.495 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6179d9b5-e6e9-4125-83a8-b9f05e25d45c#033[00m
Nov 22 03:07:03 np0005531888 systemd-udevd[231184]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:03 np0005531888 systemd-udevd[231182]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.506 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5ddf832c-d011-4707-945b-9cdb8d152d25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.507 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6179d9b5-e1 in ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5089] manager: (tapf6434b8e-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.510 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6179d9b5-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.510 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1cb3708-5cfe-48f1-88bd-42a06ce75700]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.511 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[99681d4d-54f5-4a5b-bfb2-28cbca4d8ed7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 kernel: tapd79c107f-79: entered promiscuous mode
Nov 22 03:07:03 np0005531888 kernel: tapf6434b8e-2b: entered promiscuous mode
Nov 22 03:07:03 np0005531888 systemd-udevd[231189]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5160] device (tap9397b73b-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5167] device (tap9397b73b-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00370|binding|INFO|Claiming lport f6434b8e-2b54-4b57-a8c5-ce06c6d6325d for this chassis.
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00371|binding|INFO|f6434b8e-2b54-4b57-a8c5-ce06c6d6325d: Claiming fa:16:3e:4d:68:af 10.100.0.154
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00372|binding|INFO|Claiming lport d79c107f-7912-4ccc-9651-f7791b24d22c for this chassis.
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00373|binding|INFO|d79c107f-7912-4ccc-9651-f7791b24d22c: Claiming fa:16:3e:cc:73:ce 10.100.1.21
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.518 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.522 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[8ffbe962-82e7-4927-9194-a8eca1599c9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5241] device (tapd79c107f-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5252] device (tapd79c107f-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.525 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00374|binding|INFO|Setting lport 9397b73b-bcd8-436d-ac03-5350a4a977b8 ovn-installed in OVS
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.528 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5297] device (tapf6434b8e-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.5308] device (tapf6434b8e-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.531 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:73:ce 10.100.1.21'], port_security=['fa:16:3e:cc:73:ce 10.100.1.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.21/24', 'neutron:device_id': 'dca0936b-0f9e-4a24-a613-087b4a117a05', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40ade3e4-14f4-4dad-853b-815e30349996', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=018a3c1d-4e5c-4ca7-8402-1e4fa9d4f4dc, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d79c107f-7912-4ccc-9651-f7791b24d22c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.532 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:68:af 10.100.0.154'], port_security=['fa:16:3e:4d:68:af 10.100.0.154'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.154/24', 'neutron:device_id': 'dca0936b-0f9e-4a24-a613-087b4a117a05', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b19c0ba2-0afd-4104-905d-6395b6c0bd43, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00375|binding|INFO|Setting lport 9397b73b-bcd8-436d-ac03-5350a4a977b8 up in Southbound
Nov 22 03:07:03 np0005531888 systemd-machined[153106]: New machine qemu-52-instance-0000006e.
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.566 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f849c567-40b0-447a-b3a0-5c2899d7929f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00376|binding|INFO|Setting lport f6434b8e-2b54-4b57-a8c5-ce06c6d6325d ovn-installed in OVS
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00377|binding|INFO|Setting lport f6434b8e-2b54-4b57-a8c5-ce06c6d6325d up in Southbound
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00378|binding|INFO|Setting lport d79c107f-7912-4ccc-9651-f7791b24d22c ovn-installed in OVS
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00379|binding|INFO|Setting lport d79c107f-7912-4ccc-9651-f7791b24d22c up in Southbound
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.569 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 systemd[1]: Started Virtual Machine qemu-52-instance-0000006e.
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.595 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[54a9a60d-db05-4757-b664-033b3f286c0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.6029] manager: (tap6179d9b5-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/182)
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.601 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[45d972ec-0603-4057-a5c0-c07143032177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.639 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3077b2f7-7436-409a-a355-c699cbb8cf99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.642 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0ca063-f0e1-4219-8b15-78b52d67fcff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.6632] device (tap6179d9b5-e0): carrier: link connected
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.667 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a7dd68-a04a-4a2b-b08d-69ac770a5e77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.685 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d427f597-59f1-4039-a294-dfbf5cc75553]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6179d9b5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:6b:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556430, 'reachable_time': 36077, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231223, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.701 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fc31c604-79f5-4da7-937f-c4e46b9b21f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:6bcb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556430, 'tstamp': 556430}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231224, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.713 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6f388df7-01db-4a27-8783-9c51f751683c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6179d9b5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:6b:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556430, 'reachable_time': 36077, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231225, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.739 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c17aef85-8690-4927-846c-e711e422d807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.793 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ef91eb85-380c-48cc-b24e-a7f092e359da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.794 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6179d9b5-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.795 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.795 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6179d9b5-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:03 np0005531888 NetworkManager[55166]: <info>  [1763798823.7983] manager: (tap6179d9b5-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.797 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 kernel: tap6179d9b5-e0: entered promiscuous mode
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.800 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.801 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6179d9b5-e0, col_values=(('external_ids', {'iface-id': 'd66e5699-5d16-4875-a9b2-21678e5443d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:03Z|00380|binding|INFO|Releasing lport d66e5699-5d16-4875-a9b2-21678e5443d9 from this chassis (sb_readonly=0)
Nov 22 03:07:03 np0005531888 nova_compute[186788]: 2025-11-22 08:07:03.814 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.815 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6179d9b5-e6e9-4125-83a8-b9f05e25d45c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6179d9b5-e6e9-4125-83a8-b9f05e25d45c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.816 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4aca7c-1724-43be-88b2-bf7a2cfce702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.817 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-6179d9b5-e6e9-4125-83a8-b9f05e25d45c
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/6179d9b5-e6e9-4125-83a8-b9f05e25d45c.pid.haproxy
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 6179d9b5-e6e9-4125-83a8-b9f05e25d45c
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:07:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:03.817 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'env', 'PROCESS_TAG=haproxy-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6179d9b5-e6e9-4125-83a8-b9f05e25d45c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.032 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798824.0317976, dca0936b-0f9e-4a24-a613-087b4a117a05 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.033 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] VM Started (Lifecycle Event)#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.053 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.059 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798824.0320835, dca0936b-0f9e-4a24-a613-087b4a117a05 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.060 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.086 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.094 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.113 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:07:04 np0005531888 podman[231266]: 2025-11-22 08:07:04.214477617 +0000 UTC m=+0.079554738 container create 6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 03:07:04 np0005531888 podman[231266]: 2025-11-22 08:07:04.157955227 +0000 UTC m=+0.023032368 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:07:04 np0005531888 systemd[1]: Started libpod-conmon-6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba.scope.
Nov 22 03:07:04 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:07:04 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1bc88bfcffbb809dd5b439fac4f6bf5e7cb27c2718efc224e415566759d01be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:07:04 np0005531888 podman[231266]: 2025-11-22 08:07:04.354742456 +0000 UTC m=+0.219819607 container init 6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:04 np0005531888 podman[231266]: 2025-11-22 08:07:04.360483117 +0000 UTC m=+0.225560238 container start 6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.382 186792 DEBUG nova.network.neutron [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updated VIF entry in instance network info cache for port f6434b8e-2b54-4b57-a8c5-ce06c6d6325d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:07:04 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [NOTICE]   (231285) : New worker (231287) forked
Nov 22 03:07:04 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [NOTICE]   (231285) : Loading success.
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.384 186792 DEBUG nova.network.neutron [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updating instance_info_cache with network_info: [{"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.398 186792 DEBUG oslo_concurrency.lockutils [req-2bacc732-31cd-4c54-9d2e-c7e6b066980d req-38a5fa07-e5d0-4737-a027-723edc5034b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dca0936b-0f9e-4a24-a613-087b4a117a05" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.418 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d79c107f-7912-4ccc-9651-f7791b24d22c in datapath 40ade3e4-14f4-4dad-853b-815e30349996 unbound from our chassis#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.420 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40ade3e4-14f4-4dad-853b-815e30349996#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.432 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5a038a-eb49-4eb3-a85c-94f6e871fd46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.433 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap40ade3e4-11 in ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.435 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap40ade3e4-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.435 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c687447b-152e-4c56-95b6-f5030de61ce5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.437 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[626f5f6c-4f30-4d7d-9fbc-ce07ecb7c66a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.451 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[48cd5706-2c11-4146-8122-3ea83cebfd9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.463 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9235e62f-4cc8-4d0c-ad76-b3ed28cd346f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.468 186792 DEBUG nova.compute.manager [req-edb5d1fa-3bd5-4538-8211-7c6a504c9b5e req-fd82246c-0bc5-4b34-a04c-1b726aa8f0e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.469 186792 DEBUG oslo_concurrency.lockutils [req-edb5d1fa-3bd5-4538-8211-7c6a504c9b5e req-fd82246c-0bc5-4b34-a04c-1b726aa8f0e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.469 186792 DEBUG oslo_concurrency.lockutils [req-edb5d1fa-3bd5-4538-8211-7c6a504c9b5e req-fd82246c-0bc5-4b34-a04c-1b726aa8f0e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.470 186792 DEBUG oslo_concurrency.lockutils [req-edb5d1fa-3bd5-4538-8211-7c6a504c9b5e req-fd82246c-0bc5-4b34-a04c-1b726aa8f0e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.470 186792 DEBUG nova.compute.manager [req-edb5d1fa-3bd5-4538-8211-7c6a504c9b5e req-fd82246c-0bc5-4b34-a04c-1b726aa8f0e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Processing event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.492 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5a26d6-aa20-4640-b9a3-1afb4025011d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 NetworkManager[55166]: <info>  [1763798824.5015] manager: (tap40ade3e4-10): new Veth device (/org/freedesktop/NetworkManager/Devices/184)
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.500 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c56be3-fc8d-492d-a532-9aed583e67ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.538 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[587c68a4-e185-4331-9319-e98298098d44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.543 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d3cb978d-a88b-4c5f-8135-34e0503bf469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 NetworkManager[55166]: <info>  [1763798824.5706] device (tap40ade3e4-10): carrier: link connected
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.576 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[131a2f39-0280-4044-866b-11ffe78f41ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.592 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[62807ac7-4a4f-4f4f-8f65-45bc2d21d133]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40ade3e4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:11:01'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556521, 'reachable_time': 17206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231306, 'error': None, 'target': 'ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.608 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[074aae56-c57f-433a-866d-1b26fb2e1317]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:1101'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556521, 'tstamp': 556521}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231307, 'error': None, 'target': 'ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.627 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d9b7e8-2edb-4277-9138-a5d321cd6b4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40ade3e4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:11:01'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556521, 'reachable_time': 17206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231308, 'error': None, 'target': 'ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.657 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a755ee84-0072-47f1-8d2e-8d8f3de345b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.711 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4523894d-4258-4fd9-bc27-73a822bcbc09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.713 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40ade3e4-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.713 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.714 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40ade3e4-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:04 np0005531888 NetworkManager[55166]: <info>  [1763798824.7162] manager: (tap40ade3e4-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Nov 22 03:07:04 np0005531888 kernel: tap40ade3e4-10: entered promiscuous mode
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.716 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.718 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40ade3e4-10, col_values=(('external_ids', {'iface-id': '2064fe08-262b-4bce-a621-0e358c276298'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:04 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:04Z|00381|binding|INFO|Releasing lport 2064fe08-262b-4bce-a621-0e358c276298 from this chassis (sb_readonly=0)
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.719 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.732 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.733 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/40ade3e4-14f4-4dad-853b-815e30349996.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/40ade3e4-14f4-4dad-853b-815e30349996.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.734 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4b309a-f4b6-43b7-839e-27eb1bd4322b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.735 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-40ade3e4-14f4-4dad-853b-815e30349996
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/40ade3e4-14f4-4dad-853b-815e30349996.pid.haproxy
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 40ade3e4-14f4-4dad-853b-815e30349996
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:07:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:04.735 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996', 'env', 'PROCESS_TAG=haproxy-40ade3e4-14f4-4dad-853b-815e30349996', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/40ade3e4-14f4-4dad-853b-815e30349996.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.993 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.993 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:04 np0005531888 nova_compute[186788]: 2025-11-22 08:07:04.993 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.076 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:05 np0005531888 podman[231336]: 2025-11-22 08:07:05.097254014 +0000 UTC m=+0.054216714 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:07:05 np0005531888 podman[231332]: 2025-11-22 08:07:05.097451269 +0000 UTC m=+0.056695026 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.138 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.139 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:05 np0005531888 podman[231352]: 2025-11-22 08:07:05.161876093 +0000 UTC m=+0.105582718 container create 56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:05 np0005531888 podman[231352]: 2025-11-22 08:07:05.090775145 +0000 UTC m=+0.034481790 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.208 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.216 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:05 np0005531888 systemd[1]: Started libpod-conmon-56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55.scope.
Nov 22 03:07:05 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:07:05 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a27b7c518b0b934522eddd8a097545f23398efe5b2c4bd0f41004d3e955e56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:07:05 np0005531888 podman[231352]: 2025-11-22 08:07:05.261830181 +0000 UTC m=+0.205536846 container init 56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:07:05 np0005531888 podman[231352]: 2025-11-22 08:07:05.267327556 +0000 UTC m=+0.211034191 container start 56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.279 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.280 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:05 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [NOTICE]   (231404) : New worker (231409) forked
Nov 22 03:07:05 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [NOTICE]   (231404) : Loading success.
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.340 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.358 104023 INFO neutron.agent.ovn.metadata.agent [-] Port f6434b8e-2b54-4b57-a8c5-ce06c6d6325d in datapath 6179d9b5-e6e9-4125-83a8-b9f05e25d45c unbound from our chassis#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.360 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6179d9b5-e6e9-4125-83a8-b9f05e25d45c#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.378 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c64e6229-ffe4-429d-ab1d-77fcbc8e316c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.408 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[edb3a5c7-8df1-4115-94da-f27da3ffc466]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.412 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[55d7dd5c-1bf3-4e46-aacb-0f0dee341dbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.439 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9d3b63-e61f-43c5-831e-574a03aa5da1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.457 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fb540f0a-f87d-44e0-bf8b-9517ec3f65dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6179d9b5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:6b:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556430, 'reachable_time': 36077, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231425, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.473 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[48056f36-1022-40ac-b014-19200577fc7e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6179d9b5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556440, 'tstamp': 556440}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231426, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap6179d9b5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556443, 'tstamp': 556443}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231426, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.475 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6179d9b5-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.477 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.478 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6179d9b5-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.478 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.479 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6179d9b5-e0, col_values=(('external_ids', {'iface-id': 'd66e5699-5d16-4875-a9b2-21678e5443d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:05.479 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.567 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.569 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5473MB free_disk=73.24488830566406GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.569 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.569 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.712 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance a5045f34-cbc5-4b30-8165-f1fe663be743 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.713 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance dca0936b-0f9e-4a24-a613-087b4a117a05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.713 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.713 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.766 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.778 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.836 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:07:05 np0005531888 nova_compute[186788]: 2025-11-22 08:07:05.837 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.776 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.776 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.776 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.777 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.777 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No event matching network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 in dict_keys([('network-vif-plugged', 'd79c107f-7912-4ccc-9651-f7791b24d22c'), ('network-vif-plugged', 'f6434b8e-2b54-4b57-a8c5-ce06c6d6325d')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.777 186792 WARNING nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.777 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.778 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.778 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.778 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.779 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Processing event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.779 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.779 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.779 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.780 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.780 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No event matching network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d in dict_keys([('network-vif-plugged', 'd79c107f-7912-4ccc-9651-f7791b24d22c')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.780 186792 WARNING nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.780 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.781 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.781 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.781 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.781 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Processing event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.781 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.782 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.782 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.782 186792 DEBUG oslo_concurrency.lockutils [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.782 186792 DEBUG nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.783 186792 WARNING nova.compute.manager [req-aea57174-e580-486c-8c65-f2787b44a344 req-bed5dc09-bbde-45ab-a357-3eec4ae13a38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.784 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.788 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798826.7884498, dca0936b-0f9e-4a24-a613-087b4a117a05 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.789 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.791 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.795 186792 INFO nova.virt.libvirt.driver [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance spawned successfully.#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.796 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.811 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.816 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.821 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.822 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.822 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.822 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.823 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.823 186792 DEBUG nova.virt.libvirt.driver [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.837 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.849 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.897 186792 INFO nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Took 17.80 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.897 186792 DEBUG nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:06 np0005531888 nova_compute[186788]: 2025-11-22 08:07:06.992 186792 INFO nova.compute.manager [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Took 18.30 seconds to build instance.#033[00m
Nov 22 03:07:07 np0005531888 nova_compute[186788]: 2025-11-22 08:07:07.016 186792 DEBUG oslo_concurrency.lockutils [None req-e8645fdd-a37e-43a7-8811-4b522ebabdd9 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:07 np0005531888 nova_compute[186788]: 2025-11-22 08:07:07.199 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.529 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.765 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.766 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.767 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.767 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.767 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.777 186792 INFO nova.compute.manager [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Terminating instance#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.786 186792 DEBUG nova.compute.manager [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:07:08 np0005531888 kernel: tap9397b73b-bc (unregistering): left promiscuous mode
Nov 22 03:07:08 np0005531888 NetworkManager[55166]: <info>  [1763798828.8096] device (tap9397b73b-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00382|binding|INFO|Releasing lport 9397b73b-bcd8-436d-ac03-5350a4a977b8 from this chassis (sb_readonly=0)
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00383|binding|INFO|Setting lport 9397b73b-bcd8-436d-ac03-5350a4a977b8 down in Southbound
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00384|binding|INFO|Removing iface tap9397b73b-bc ovn-installed in OVS
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.817 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.822 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:5b:b6 10.100.0.184'], port_security=['fa:16:3e:5b:5b:b6 10.100.0.184'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.184/24', 'neutron:device_id': 'dca0936b-0f9e-4a24-a613-087b4a117a05', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b19c0ba2-0afd-4104-905d-6395b6c0bd43, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=9397b73b-bcd8-436d-ac03-5350a4a977b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.824 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 9397b73b-bcd8-436d-ac03-5350a4a977b8 in datapath 6179d9b5-e6e9-4125-83a8-b9f05e25d45c unbound from our chassis#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.826 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6179d9b5-e6e9-4125-83a8-b9f05e25d45c#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.833 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 kernel: tapd79c107f-79 (unregistering): left promiscuous mode
Nov 22 03:07:08 np0005531888 NetworkManager[55166]: <info>  [1763798828.8422] device (tapd79c107f-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.843 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[556bd812-74e0-4127-8052-d624676455e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00385|binding|INFO|Releasing lport d79c107f-7912-4ccc-9651-f7791b24d22c from this chassis (sb_readonly=0)
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00386|binding|INFO|Setting lport d79c107f-7912-4ccc-9651-f7791b24d22c down in Southbound
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00387|binding|INFO|Removing iface tapd79c107f-79 ovn-installed in OVS
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.850 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.852 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.859 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:73:ce 10.100.1.21'], port_security=['fa:16:3e:cc:73:ce 10.100.1.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.21/24', 'neutron:device_id': 'dca0936b-0f9e-4a24-a613-087b4a117a05', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40ade3e4-14f4-4dad-853b-815e30349996', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=018a3c1d-4e5c-4ca7-8402-1e4fa9d4f4dc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d79c107f-7912-4ccc-9651-f7791b24d22c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.872 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3abbc830-76d7-4e15-8c6f-1ac5b957dba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.875 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a41f2f11-3ce8-46ee-ab21-2d224e7ed8cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 kernel: tapf6434b8e-2b (unregistering): left promiscuous mode
Nov 22 03:07:08 np0005531888 NetworkManager[55166]: <info>  [1763798828.8818] device (tapf6434b8e-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.892 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00388|binding|INFO|Releasing lport f6434b8e-2b54-4b57-a8c5-ce06c6d6325d from this chassis (sb_readonly=0)
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00389|binding|INFO|Setting lport f6434b8e-2b54-4b57-a8c5-ce06c6d6325d down in Southbound
Nov 22 03:07:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:08Z|00390|binding|INFO|Removing iface tapf6434b8e-2b ovn-installed in OVS
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.894 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.899 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:68:af 10.100.0.154'], port_security=['fa:16:3e:4d:68:af 10.100.0.154'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.154/24', 'neutron:device_id': 'dca0936b-0f9e-4a24-a613-087b4a117a05', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b19c0ba2-0afd-4104-905d-6395b6c0bd43, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.908 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2d511d-d623-40db-91f7-f4809af4f74a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.916 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.931 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[82e3339d-8fe3-4973-8529-b848aa108b2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6179d9b5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:6b:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556430, 'reachable_time': 36077, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231451, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Nov 22 03:07:08 np0005531888 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006e.scope: Consumed 2.449s CPU time.
Nov 22 03:07:08 np0005531888 systemd-machined[153106]: Machine qemu-52-instance-0000006e terminated.
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.951 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[27ddfb4c-8a8c-4fe4-80db-45f6350e9d02]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6179d9b5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556440, 'tstamp': 556440}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231452, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tap6179d9b5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556443, 'tstamp': 556443}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231452, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.954 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6179d9b5-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 nova_compute[186788]: 2025-11-22 08:07:08.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.967 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6179d9b5-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.968 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.968 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6179d9b5-e0, col_values=(('external_ids', {'iface-id': 'd66e5699-5d16-4875-a9b2-21678e5443d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.969 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.971 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d79c107f-7912-4ccc-9651-f7791b24d22c in datapath 40ade3e4-14f4-4dad-853b-815e30349996 unbound from our chassis#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.973 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 40ade3e4-14f4-4dad-853b-815e30349996, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.974 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4efd6083-9960-4bac-bfc4-23dc0b417e00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:08.975 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996 namespace which is not needed anymore#033[00m
Nov 22 03:07:09 np0005531888 NetworkManager[55166]: <info>  [1763798829.0211] manager: (tapd79c107f-79): new Tun device (/org/freedesktop/NetworkManager/Devices/186)
Nov 22 03:07:09 np0005531888 NetworkManager[55166]: <info>  [1763798829.0357] manager: (tapf6434b8e-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.078 186792 INFO nova.virt.libvirt.driver [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Instance destroyed successfully.#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.079 186792 DEBUG nova.objects.instance [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lazy-loading 'resources' on Instance uuid dca0936b-0f9e-4a24-a613-087b4a117a05 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.092 186792 DEBUG nova.virt.libvirt.vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:06Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.092 186792 DEBUG nova.network.os_vif_util [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.093 186792 DEBUG nova.network.os_vif_util [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.093 186792 DEBUG os_vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.096 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.096 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9397b73b-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.098 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.100 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.108 186792 INFO os_vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:5b:b6,bridge_name='br-int',has_traffic_filtering=True,id=9397b73b-bcd8-436d-ac03-5350a4a977b8,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9397b73b-bc')#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.109 186792 DEBUG nova.virt.libvirt.vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:06Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.110 186792 DEBUG nova.network.os_vif_util [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "d79c107f-7912-4ccc-9651-f7791b24d22c", "address": "fa:16:3e:cc:73:ce", "network": {"id": "40ade3e4-14f4-4dad-853b-815e30349996", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1950548299", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd79c107f-79", "ovs_interfaceid": "d79c107f-7912-4ccc-9651-f7791b24d22c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.110 186792 DEBUG nova.network.os_vif_util [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.111 186792 DEBUG os_vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.112 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.113 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd79c107f-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.114 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.116 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.119 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.121 186792 INFO os_vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:73:ce,bridge_name='br-int',has_traffic_filtering=True,id=d79c107f-7912-4ccc-9651-f7791b24d22c,network=Network(40ade3e4-14f4-4dad-853b-815e30349996),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd79c107f-79')#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.122 186792 DEBUG nova.virt.libvirt.vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-10239088',display_name='tempest-ServersTestMultiNic-server-10239088',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-10239088',id=110,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-235vzz0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:06Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=dca0936b-0f9e-4a24-a613-087b4a117a05,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.122 186792 DEBUG nova.network.os_vif_util [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.123 186792 DEBUG nova.network.os_vif_util [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.123 186792 DEBUG os_vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.124 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.125 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6434b8e-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.129 186792 INFO os_vif [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:68:af,bridge_name='br-int',has_traffic_filtering=True,id=f6434b8e-2b54-4b57-a8c5-ce06c6d6325d,network=Network(6179d9b5-e6e9-4125-83a8-b9f05e25d45c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6434b8e-2b')#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.130 186792 INFO nova.virt.libvirt.driver [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Deleting instance files /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05_del#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.131 186792 INFO nova.virt.libvirt.driver [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Deletion of /var/lib/nova/instances/dca0936b-0f9e-4a24-a613-087b4a117a05_del complete#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.229 186792 INFO nova.compute.manager [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.230 186792 DEBUG oslo.service.loopingcall [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.230 186792 DEBUG nova.compute.manager [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.231 186792 DEBUG nova.network.neutron [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [NOTICE]   (231404) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [NOTICE]   (231404) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [WARNING]  (231404) : Exiting Master process...
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [WARNING]  (231404) : Exiting Master process...
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [ALERT]    (231404) : Current worker (231409) exited with code 143 (Terminated)
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996[231399]: [WARNING]  (231404) : All workers exited. Exiting... (0)
Nov 22 03:07:09 np0005531888 systemd[1]: libpod-56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55.scope: Deactivated successfully.
Nov 22 03:07:09 np0005531888 podman[231514]: 2025-11-22 08:07:09.469630829 +0000 UTC m=+0.378362175 container died 56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:07:09 np0005531888 systemd[1]: var-lib-containers-storage-overlay-62a27b7c518b0b934522eddd8a097545f23398efe5b2c4bd0f41004d3e955e56-merged.mount: Deactivated successfully.
Nov 22 03:07:09 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:09 np0005531888 podman[231514]: 2025-11-22 08:07:09.541201449 +0000 UTC m=+0.449932795 container cleanup 56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:07:09 np0005531888 systemd[1]: libpod-conmon-56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55.scope: Deactivated successfully.
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.563 186792 DEBUG nova.compute.manager [req-98353bfa-e4a6-4d52-b1ec-02df1aaefa80 req-0ff90793-4ac6-4a54-9a92-9b1d78fec8a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-unplugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.564 186792 DEBUG oslo_concurrency.lockutils [req-98353bfa-e4a6-4d52-b1ec-02df1aaefa80 req-0ff90793-4ac6-4a54-9a92-9b1d78fec8a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.564 186792 DEBUG oslo_concurrency.lockutils [req-98353bfa-e4a6-4d52-b1ec-02df1aaefa80 req-0ff90793-4ac6-4a54-9a92-9b1d78fec8a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.564 186792 DEBUG oslo_concurrency.lockutils [req-98353bfa-e4a6-4d52-b1ec-02df1aaefa80 req-0ff90793-4ac6-4a54-9a92-9b1d78fec8a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.564 186792 DEBUG nova.compute.manager [req-98353bfa-e4a6-4d52-b1ec-02df1aaefa80 req-0ff90793-4ac6-4a54-9a92-9b1d78fec8a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-unplugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.565 186792 DEBUG nova.compute.manager [req-98353bfa-e4a6-4d52-b1ec-02df1aaefa80 req-0ff90793-4ac6-4a54-9a92-9b1d78fec8a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-unplugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:07:09 np0005531888 podman[231544]: 2025-11-22 08:07:09.676875425 +0000 UTC m=+0.114542047 container remove 56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.682 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[096d402c-854e-42aa-8ff6-6c0d31ff0953]: (4, ('Sat Nov 22 08:07:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996 (56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55)\n56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55\nSat Nov 22 08:07:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996 (56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55)\n56af99ede02d1e27bee5a9ad60037909b0c503aea6d54969897bdd4e09ba3b55\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.684 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4c6ba3-21a9-459a-bf74-216eca49f0b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.685 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40ade3e4-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.687 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 kernel: tap40ade3e4-10: left promiscuous mode
Nov 22 03:07:09 np0005531888 nova_compute[186788]: 2025-11-22 08:07:09.701 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.707 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2e14c9-95ca-4457-abe0-1570f000f94c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.734 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[de75f2f7-a6a0-439a-b3b9-ab1d5d4c0891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.736 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[01957684-f95b-4828-a3b6-270c91e34efe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.751 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e54a8ca7-402b-4f00-a940-65ecb881cb49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556513, 'reachable_time': 42476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231559, 'error': None, 'target': 'ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 systemd[1]: run-netns-ovnmeta\x2d40ade3e4\x2d14f4\x2d4dad\x2d853b\x2d815e30349996.mount: Deactivated successfully.
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.755 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-40ade3e4-14f4-4dad-853b-815e30349996 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.755 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[abfe8a29-d287-4c7b-8207-57a25a6293df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.756 104023 INFO neutron.agent.ovn.metadata.agent [-] Port f6434b8e-2b54-4b57-a8c5-ce06c6d6325d in datapath 6179d9b5-e6e9-4125-83a8-b9f05e25d45c unbound from our chassis#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.758 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6179d9b5-e6e9-4125-83a8-b9f05e25d45c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.759 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[13027fb6-3574-4bc0-aa37-795a545c20a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:09.760 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c namespace which is not needed anymore#033[00m
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [NOTICE]   (231285) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [NOTICE]   (231285) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [WARNING]  (231285) : Exiting Master process...
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [ALERT]    (231285) : Current worker (231287) exited with code 143 (Terminated)
Nov 22 03:07:09 np0005531888 neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c[231281]: [WARNING]  (231285) : All workers exited. Exiting... (0)
Nov 22 03:07:09 np0005531888 systemd[1]: libpod-6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba.scope: Deactivated successfully.
Nov 22 03:07:10 np0005531888 podman[231576]: 2025-11-22 08:07:10.000320778 +0000 UTC m=+0.159629556 container died 6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.011 186792 DEBUG nova.compute.manager [req-46c2ec5c-cc40-4796-af05-7add09d47e31 req-75b2b516-7052-4df8-8746-07df3b151f25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-deleted-d79c107f-7912-4ccc-9651-f7791b24d22c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.012 186792 INFO nova.compute.manager [req-46c2ec5c-cc40-4796-af05-7add09d47e31 req-75b2b516-7052-4df8-8746-07df3b151f25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Neutron deleted interface d79c107f-7912-4ccc-9651-f7791b24d22c; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.012 186792 DEBUG nova.network.neutron [req-46c2ec5c-cc40-4796-af05-7add09d47e31 req-75b2b516-7052-4df8-8746-07df3b151f25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updating instance_info_cache with network_info: [{"id": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "address": "fa:16:3e:5b:5b:b6", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9397b73b-bc", "ovs_interfaceid": "9397b73b-bcd8-436d-ac03-5350a4a977b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "address": "fa:16:3e:4d:68:af", "network": {"id": "6179d9b5-e6e9-4125-83a8-b9f05e25d45c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-21995347", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6434b8e-2b", "ovs_interfaceid": "f6434b8e-2b54-4b57-a8c5-ce06c6d6325d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.034 186792 DEBUG nova.compute.manager [req-46c2ec5c-cc40-4796-af05-7add09d47e31 req-75b2b516-7052-4df8-8746-07df3b151f25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Detach interface failed, port_id=d79c107f-7912-4ccc-9651-f7791b24d22c, reason: Instance dca0936b-0f9e-4a24-a613-087b4a117a05 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.273 186792 INFO nova.compute.manager [None req-553b912c-6c00-434a-a78a-e614290091ed 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Pausing#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.274 186792 DEBUG nova.objects.instance [None req-553b912c-6c00-434a-a78a-e614290091ed 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'flavor' on Instance uuid a5045f34-cbc5-4b30-8165-f1fe663be743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.313 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798830.3134527, a5045f34-cbc5-4b30-8165-f1fe663be743 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.314 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.318 186792 DEBUG nova.compute.manager [None req-553b912c-6c00-434a-a78a-e614290091ed 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.344 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.347 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.377 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 22 03:07:10 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:10 np0005531888 systemd[1]: var-lib-containers-storage-overlay-c1bc88bfcffbb809dd5b439fac4f6bf5e7cb27c2718efc224e415566759d01be-merged.mount: Deactivated successfully.
Nov 22 03:07:10 np0005531888 podman[231576]: 2025-11-22 08:07:10.543929896 +0000 UTC m=+0.703238674 container cleanup 6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:07:10 np0005531888 systemd[1]: libpod-conmon-6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba.scope: Deactivated successfully.
Nov 22 03:07:10 np0005531888 podman[231605]: 2025-11-22 08:07:10.639078025 +0000 UTC m=+0.067112221 container remove 6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.643 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[67624d79-3852-48c4-91a2-28a5ec34c0d8]: (4, ('Sat Nov 22 08:07:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c (6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba)\n6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba\nSat Nov 22 08:07:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c (6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba)\n6a518158ee5fbe35032c0ba1fadc33d4e7e1bffd9114aef68a42a22c600986ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.645 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ea567ab2-12e6-4c51-8006-709bebbf5832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.647 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6179d9b5-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:10 np0005531888 kernel: tap6179d9b5-e0: left promiscuous mode
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.661 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.664 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee7a09b-ae64-43e5-8b5c-ccaef8635682]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.680 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c4c06c-ab45-4cf6-8c68-81c05473a8f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.681 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cd09dba5-af12-452e-a7be-6568c74e6362]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.698 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff0fc36-71f6-4700-b935-63d10cd58184]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556423, 'reachable_time': 41397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231621, 'error': None, 'target': 'ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.700 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6179d9b5-e6e9-4125-83a8-b9f05e25d45c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:10.701 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[39d0c30a-22c2-4959-9c27-eef7018c06f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:10 np0005531888 systemd[1]: run-netns-ovnmeta\x2d6179d9b5\x2de6e9\x2d4125\x2d83a8\x2db9f05e25d45c.mount: Deactivated successfully.
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.810 186792 DEBUG nova.network.neutron [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.830 186792 INFO nova.compute.manager [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Took 1.60 seconds to deallocate network for instance.#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.883 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.884 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.956 186792 DEBUG nova.compute.provider_tree [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.968 186792 DEBUG nova.scheduler.client.report [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:10 np0005531888 nova_compute[186788]: 2025-11-22 08:07:10.986 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.010 186792 INFO nova.scheduler.client.report [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Deleted allocations for instance dca0936b-0f9e-4a24-a613-087b4a117a05#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.069 186792 DEBUG oslo_concurrency.lockutils [None req-5e38586c-d45b-40a1-b83a-8f45ee3a7b06 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.634 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.634 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.634 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.635 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.635 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.635 186792 WARNING nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-plugged-9397b73b-bcd8-436d-ac03-5350a4a977b8 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.635 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-unplugged-d79c107f-7912-4ccc-9651-f7791b24d22c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.636 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.636 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.636 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.636 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-unplugged-d79c107f-7912-4ccc-9651-f7791b24d22c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.637 186792 WARNING nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-unplugged-d79c107f-7912-4ccc-9651-f7791b24d22c for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.637 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.637 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.637 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.638 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.638 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.638 186792 WARNING nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-plugged-d79c107f-7912-4ccc-9651-f7791b24d22c for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.638 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-unplugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.639 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.639 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.639 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.639 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-unplugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.640 186792 WARNING nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-unplugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.640 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.640 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.640 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.641 186792 DEBUG oslo_concurrency.lockutils [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dca0936b-0f9e-4a24-a613-087b4a117a05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.641 186792 DEBUG nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] No waiting events found dispatching network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:11 np0005531888 nova_compute[186788]: 2025-11-22 08:07:11.641 186792 WARNING nova.compute.manager [req-bd3662e4-11bb-4971-aa7d-a4408f695cbb req-a6b6bfba-c7ec-49cb-a435-605a208722ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received unexpected event network-vif-plugged-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:12 np0005531888 nova_compute[186788]: 2025-11-22 08:07:12.094 186792 DEBUG nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-deleted-9397b73b-bcd8-436d-ac03-5350a4a977b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:12 np0005531888 nova_compute[186788]: 2025-11-22 08:07:12.094 186792 DEBUG nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Received event network-vif-deleted-f6434b8e-2b54-4b57-a8c5-ce06c6d6325d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:13.390 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:13.392 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:07:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:13.393 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.532 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.614 186792 INFO nova.compute.manager [None req-c670c6c4-b28b-4d3c-b6ce-918d7562ef9f 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Unpausing#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.615 186792 DEBUG nova.objects.instance [None req-c670c6c4-b28b-4d3c-b6ce-918d7562ef9f 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'flavor' on Instance uuid a5045f34-cbc5-4b30-8165-f1fe663be743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.641 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798833.640976, a5045f34-cbc5-4b30-8165-f1fe663be743 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.641 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:07:13 np0005531888 virtqemud[186358]: argument unsupported: QEMU guest agent is not configured
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.646 186792 DEBUG nova.virt.libvirt.guest [None req-c670c6c4-b28b-4d3c-b6ce-918d7562ef9f 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.646 186792 DEBUG nova.compute.manager [None req-c670c6c4-b28b-4d3c-b6ce-918d7562ef9f 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.668 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.670 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.687 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Nov 22 03:07:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:13Z|00391|binding|INFO|Releasing lport 298be65c-aa9e-4327-b67d-2a3d4f1acf68 from this chassis (sb_readonly=0)
Nov 22 03:07:13 np0005531888 nova_compute[186788]: 2025-11-22 08:07:13.850 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:14 np0005531888 nova_compute[186788]: 2025-11-22 08:07:14.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:14 np0005531888 podman[231622]: 2025-11-22 08:07:14.689695058 +0000 UTC m=+0.056105621 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:07:17 np0005531888 podman[231642]: 2025-11-22 08:07:17.682798847 +0000 UTC m=+0.055427533 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.534 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.748 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.748 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.749 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.749 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.749 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.755 186792 INFO nova.compute.manager [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Terminating instance#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.763 186792 DEBUG nova.compute.manager [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:07:18 np0005531888 kernel: tap21f2a09e-e7 (unregistering): left promiscuous mode
Nov 22 03:07:18 np0005531888 NetworkManager[55166]: <info>  [1763798838.8616] device (tap21f2a09e-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:18Z|00392|binding|INFO|Releasing lport 21f2a09e-e781-4e4e-9659-691ca54ee1d8 from this chassis (sb_readonly=0)
Nov 22 03:07:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:18Z|00393|binding|INFO|Setting lport 21f2a09e-e781-4e4e-9659-691ca54ee1d8 down in Southbound
Nov 22 03:07:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:18Z|00394|binding|INFO|Removing iface tap21f2a09e-e7 ovn-installed in OVS
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.875 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:18.882 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:51:b8 10.100.0.13'], port_security=['fa:16:3e:47:51:b8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a5045f34-cbc5-4b30-8165-f1fe663be743', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9714091-78f6-46c8-b55b-4a278bd99b49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7086819eb340f28dd7087159d82fa3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '65f3b143-522b-4e83-8261-f97700b0bd79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a437e229-533d-4315-8ee6-05d493bb5ad7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=21f2a09e-e781-4e4e-9659-691ca54ee1d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:18.884 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 21f2a09e-e781-4e4e-9659-691ca54ee1d8 in datapath f9714091-78f6-46c8-b55b-4a278bd99b49 unbound from our chassis#033[00m
Nov 22 03:07:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:18.885 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9714091-78f6-46c8-b55b-4a278bd99b49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:18.886 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a3c2e5-8033-417b-8edb-8bc42b23fc70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:18.886 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 namespace which is not needed anymore#033[00m
Nov 22 03:07:18 np0005531888 nova_compute[186788]: 2025-11-22 08:07:18.890 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:18 np0005531888 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Nov 22 03:07:18 np0005531888 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006b.scope: Consumed 15.893s CPU time.
Nov 22 03:07:18 np0005531888 systemd-machined[153106]: Machine qemu-51-instance-0000006b terminated.
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.021 186792 INFO nova.virt.libvirt.driver [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Instance destroyed successfully.#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.022 186792 DEBUG nova.objects.instance [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'resources' on Instance uuid a5045f34-cbc5-4b30-8165-f1fe663be743 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.034 186792 DEBUG nova.virt.libvirt.vif [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1538326568',display_name='tempest-ServerRescueNegativeTestJSON-server-1538326568',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1538326568',id=107,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-lhanirue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner_user_name='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:13Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=a5045f34-cbc5-4b30-8165-f1fe663be743,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.034 186792 DEBUG nova.network.os_vif_util [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "address": "fa:16:3e:47:51:b8", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21f2a09e-e7", "ovs_interfaceid": "21f2a09e-e781-4e4e-9659-691ca54ee1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.035 186792 DEBUG nova.network.os_vif_util [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.035 186792 DEBUG os_vif [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.036 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.037 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21f2a09e-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.038 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.042 186792 INFO os_vif [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:51:b8,bridge_name='br-int',has_traffic_filtering=True,id=21f2a09e-e781-4e4e-9659-691ca54ee1d8,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21f2a09e-e7')#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.043 186792 INFO nova.virt.libvirt.driver [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Deleting instance files /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743_del#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.043 186792 INFO nova.virt.libvirt.driver [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Deletion of /var/lib/nova/instances/a5045f34-cbc5-4b30-8165-f1fe663be743_del complete#033[00m
Nov 22 03:07:19 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [NOTICE]   (230964) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:19 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [NOTICE]   (230964) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:19 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [WARNING]  (230964) : Exiting Master process...
Nov 22 03:07:19 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [ALERT]    (230964) : Current worker (230966) exited with code 143 (Terminated)
Nov 22 03:07:19 np0005531888 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230960]: [WARNING]  (230964) : All workers exited. Exiting... (0)
Nov 22 03:07:19 np0005531888 systemd[1]: libpod-b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8.scope: Deactivated successfully.
Nov 22 03:07:19 np0005531888 podman[231688]: 2025-11-22 08:07:19.109021758 +0000 UTC m=+0.134680253 container died b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.290 186792 DEBUG nova.compute.manager [req-9c38b234-6a23-4abd-89ea-f2c1e7b922e9 req-944b4a49-b25b-4fe1-b38e-67e21955a403 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-vif-unplugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.290 186792 DEBUG oslo_concurrency.lockutils [req-9c38b234-6a23-4abd-89ea-f2c1e7b922e9 req-944b4a49-b25b-4fe1-b38e-67e21955a403 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.291 186792 DEBUG oslo_concurrency.lockutils [req-9c38b234-6a23-4abd-89ea-f2c1e7b922e9 req-944b4a49-b25b-4fe1-b38e-67e21955a403 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.291 186792 DEBUG oslo_concurrency.lockutils [req-9c38b234-6a23-4abd-89ea-f2c1e7b922e9 req-944b4a49-b25b-4fe1-b38e-67e21955a403 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.291 186792 DEBUG nova.compute.manager [req-9c38b234-6a23-4abd-89ea-f2c1e7b922e9 req-944b4a49-b25b-4fe1-b38e-67e21955a403 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] No waiting events found dispatching network-vif-unplugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.291 186792 DEBUG nova.compute.manager [req-9c38b234-6a23-4abd-89ea-f2c1e7b922e9 req-944b4a49-b25b-4fe1-b38e-67e21955a403 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-vif-unplugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.443 186792 INFO nova.compute.manager [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.444 186792 DEBUG oslo.service.loopingcall [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.444 186792 DEBUG nova.compute.manager [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:07:19 np0005531888 nova_compute[186788]: 2025-11-22 08:07:19.444 186792 DEBUG nova.network.neutron [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.123 186792 DEBUG nova.network.neutron [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.140 186792 INFO nova.compute.manager [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Took 0.70 seconds to deallocate network for instance.#033[00m
Nov 22 03:07:20 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:20 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e198e64d0a68a47eeba7f1602f3e39a50eaae99c28b21554e713613317ec9274-merged.mount: Deactivated successfully.
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.197 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.198 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:20 np0005531888 podman[231688]: 2025-11-22 08:07:20.347770059 +0000 UTC m=+1.373428544 container cleanup b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:07:20 np0005531888 systemd[1]: libpod-conmon-b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8.scope: Deactivated successfully.
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.395 186792 DEBUG nova.compute.provider_tree [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.409 186792 DEBUG nova.scheduler.client.report [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.429 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.457 186792 INFO nova.scheduler.client.report [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Deleted allocations for instance a5045f34-cbc5-4b30-8165-f1fe663be743#033[00m
Nov 22 03:07:20 np0005531888 podman[231732]: 2025-11-22 08:07:20.462193523 +0000 UTC m=+0.092734652 container remove b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.470 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ed5aed-8a3f-4a38-902c-4ecbfbe8d6da]: (4, ('Sat Nov 22 08:07:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 (b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8)\nb6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8\nSat Nov 22 08:07:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 (b6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8)\nb6f6e15ea0a08f6495abf9829c2cfd4ac666413a9d34140dada25ae18d8e66a8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.472 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6acf274a-6b57-4b0a-9bb1-2853eaf8afa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.473 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9714091-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.475 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:20 np0005531888 kernel: tapf9714091-70: left promiscuous mode
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.493 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2d9ce9-c2e0-4f77-9667-4c17385eaac7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.510 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8e161dc9-0105-483f-9c8f-028cf9d99956]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.511 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3f9c2e42-223a-41c8-a879-6c4ae540f0f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:20 np0005531888 nova_compute[186788]: 2025-11-22 08:07:20.524 186792 DEBUG oslo_concurrency.lockutils [None req-d7b75fb6-165c-466a-9c11-60546b9b5ac4 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.526 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[37fc35bd-269b-4bc5-9939-28e4a04dc5e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552938, 'reachable_time': 41265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231747, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:20 np0005531888 systemd[1]: run-netns-ovnmeta\x2df9714091\x2d78f6\x2d46c8\x2db55b\x2d4a278bd99b49.mount: Deactivated successfully.
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.529 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:20.530 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0f404e-1251-4c43-a92a-9747f639a84a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:21 np0005531888 nova_compute[186788]: 2025-11-22 08:07:21.383 186792 DEBUG nova.compute.manager [req-5f0ff691-3f4c-43ff-9049-815a4265cee4 req-dfe57d18-c325-4352-9609-7132dccb6b7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:21 np0005531888 nova_compute[186788]: 2025-11-22 08:07:21.383 186792 DEBUG oslo_concurrency.lockutils [req-5f0ff691-3f4c-43ff-9049-815a4265cee4 req-dfe57d18-c325-4352-9609-7132dccb6b7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:21 np0005531888 nova_compute[186788]: 2025-11-22 08:07:21.383 186792 DEBUG oslo_concurrency.lockutils [req-5f0ff691-3f4c-43ff-9049-815a4265cee4 req-dfe57d18-c325-4352-9609-7132dccb6b7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:21 np0005531888 nova_compute[186788]: 2025-11-22 08:07:21.384 186792 DEBUG oslo_concurrency.lockutils [req-5f0ff691-3f4c-43ff-9049-815a4265cee4 req-dfe57d18-c325-4352-9609-7132dccb6b7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a5045f34-cbc5-4b30-8165-f1fe663be743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:21 np0005531888 nova_compute[186788]: 2025-11-22 08:07:21.384 186792 DEBUG nova.compute.manager [req-5f0ff691-3f4c-43ff-9049-815a4265cee4 req-dfe57d18-c325-4352-9609-7132dccb6b7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] No waiting events found dispatching network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:21 np0005531888 nova_compute[186788]: 2025-11-22 08:07:21.384 186792 WARNING nova.compute.manager [req-5f0ff691-3f4c-43ff-9049-815a4265cee4 req-dfe57d18-c325-4352-9609-7132dccb6b7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received unexpected event network-vif-plugged-21f2a09e-e781-4e4e-9659-691ca54ee1d8 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:21 np0005531888 podman[231748]: 2025-11-22 08:07:21.717486199 +0000 UTC m=+0.081073384 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 03:07:22 np0005531888 nova_compute[186788]: 2025-11-22 08:07:22.129 186792 DEBUG nova.compute.manager [req-2c964fff-369c-4178-a1ca-0a91315fd84a req-4fb5fca7-597e-4ded-ac32-eb5ed828f506 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Received event network-vif-deleted-21f2a09e-e781-4e4e-9659-691ca54ee1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:23 np0005531888 nova_compute[186788]: 2025-11-22 08:07:23.536 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:24 np0005531888 nova_compute[186788]: 2025-11-22 08:07:24.039 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:24 np0005531888 nova_compute[186788]: 2025-11-22 08:07:24.076 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798829.0752003, dca0936b-0f9e-4a24-a613-087b4a117a05 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:24 np0005531888 nova_compute[186788]: 2025-11-22 08:07:24.077 186792 INFO nova.compute.manager [-] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:07:24 np0005531888 nova_compute[186788]: 2025-11-22 08:07:24.104 186792 DEBUG nova.compute.manager [None req-9f589e50-b4dd-4f61-a2b3-ac1fab9a5eb9 - - - - - -] [instance: dca0936b-0f9e-4a24-a613-087b4a117a05] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:24 np0005531888 nova_compute[186788]: 2025-11-22 08:07:24.369 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:26 np0005531888 nova_compute[186788]: 2025-11-22 08:07:26.901 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:26 np0005531888 nova_compute[186788]: 2025-11-22 08:07:26.901 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:26 np0005531888 nova_compute[186788]: 2025-11-22 08:07:26.924 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:07:26 np0005531888 podman[231769]: 2025-11-22 08:07:26.991167277 +0000 UTC m=+0.065121433 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 03:07:27 np0005531888 podman[231770]: 2025-11-22 08:07:27.02424617 +0000 UTC m=+0.093467570 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.026 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.027 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.034 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.034 186792 INFO nova.compute.claims [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.242 186792 DEBUG nova.compute.provider_tree [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.259 186792 DEBUG nova.scheduler.client.report [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.296 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.296 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.396 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.396 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.462 186792 INFO nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.486 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.656 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.657 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.657 186792 INFO nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Creating image(s)#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.658 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "/var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.658 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "/var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.659 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "/var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.671 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.738 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.739 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.740 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.751 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.819 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.820 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.868 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.869 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.869 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.936 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.938 186792 DEBUG nova.virt.disk.api [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Checking if we can resize image /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.938 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531888 nova_compute[186788]: 2025-11-22 08:07:27.963 186792 DEBUG nova.policy [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1b53cb76c914b98afb487ff6059ebfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.003 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.004 186792 DEBUG nova.virt.disk.api [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Cannot resize image /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.005 186792 DEBUG nova.objects.instance [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lazy-loading 'migration_context' on Instance uuid 57e661f0-0363-413f-8e02-7c12e8336789 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.023 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.024 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Ensure instance console log exists: /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.024 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.024 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.025 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:28 np0005531888 nova_compute[186788]: 2025-11-22 08:07:28.538 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:29 np0005531888 nova_compute[186788]: 2025-11-22 08:07:29.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:29 np0005531888 nova_compute[186788]: 2025-11-22 08:07:29.603 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Successfully created port: 02662e23-ed3c-4273-8579-899251b4f9ea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:07:32 np0005531888 nova_compute[186788]: 2025-11-22 08:07:32.156 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Successfully created port: 495fd4f0-57d3-4f58-a770-801f0d4d3e57 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:07:33 np0005531888 nova_compute[186788]: 2025-11-22 08:07:33.539 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.021 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798839.0194268, a5045f34-cbc5-4b30-8165-f1fe663be743 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.022 186792 INFO nova.compute.manager [-] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.045 186792 DEBUG nova.compute.manager [None req-2eae12c7-52dd-46cb-836d-1a4efea1305f - - - - - -] [instance: a5045f34-cbc5-4b30-8165-f1fe663be743] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.046 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.363 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Successfully updated port: 02662e23-ed3c-4273-8579-899251b4f9ea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.481 186792 DEBUG nova.compute.manager [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-changed-02662e23-ed3c-4273-8579-899251b4f9ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.481 186792 DEBUG nova.compute.manager [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Refreshing instance network info cache due to event network-changed-02662e23-ed3c-4273-8579-899251b4f9ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.481 186792 DEBUG oslo_concurrency.lockutils [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.482 186792 DEBUG oslo_concurrency.lockutils [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.482 186792 DEBUG nova.network.neutron [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Refreshing network info cache for port 02662e23-ed3c-4273-8579-899251b4f9ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:07:34 np0005531888 nova_compute[186788]: 2025-11-22 08:07:34.803 186792 DEBUG nova.network.neutron [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:07:35 np0005531888 nova_compute[186788]: 2025-11-22 08:07:35.533 186792 DEBUG nova.network.neutron [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:35 np0005531888 nova_compute[186788]: 2025-11-22 08:07:35.559 186792 DEBUG oslo_concurrency.lockutils [req-ed6e7c62-651e-4309-b5ac-3ae03e829205 req-bebd2c3b-4e98-46e9-aec3-482758c7f04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:35 np0005531888 podman[231829]: 2025-11-22 08:07:35.68607086 +0000 UTC m=+0.056340837 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:07:35 np0005531888 podman[231830]: 2025-11-22 08:07:35.723860389 +0000 UTC m=+0.086718563 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:07:35 np0005531888 nova_compute[186788]: 2025-11-22 08:07:35.761 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Successfully updated port: 495fd4f0-57d3-4f58-a770-801f0d4d3e57 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:07:35 np0005531888 nova_compute[186788]: 2025-11-22 08:07:35.817 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:35 np0005531888 nova_compute[186788]: 2025-11-22 08:07:35.817 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquired lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:35 np0005531888 nova_compute[186788]: 2025-11-22 08:07:35.818 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:07:36 np0005531888 nova_compute[186788]: 2025-11-22 08:07:36.152 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:07:36 np0005531888 nova_compute[186788]: 2025-11-22 08:07:36.680 186792 DEBUG nova.compute.manager [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-changed-495fd4f0-57d3-4f58-a770-801f0d4d3e57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:36 np0005531888 nova_compute[186788]: 2025-11-22 08:07:36.681 186792 DEBUG nova.compute.manager [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Refreshing instance network info cache due to event network-changed-495fd4f0-57d3-4f58-a770-801f0d4d3e57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:07:36 np0005531888 nova_compute[186788]: 2025-11-22 08:07:36.681 186792 DEBUG oslo_concurrency.lockutils [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:36.822 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:36.822 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:36.823 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:38 np0005531888 nova_compute[186788]: 2025-11-22 08:07:38.540 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:39 np0005531888 nova_compute[186788]: 2025-11-22 08:07:39.049 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:41 np0005531888 nova_compute[186788]: 2025-11-22 08:07:41.975 186792 DEBUG nova.network.neutron [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updating instance_info_cache with network_info: [{"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.542 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.895 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Releasing lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.895 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance network_info: |[{"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.896 186792 DEBUG oslo_concurrency.lockutils [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.896 186792 DEBUG nova.network.neutron [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Refreshing network info cache for port 495fd4f0-57d3-4f58-a770-801f0d4d3e57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.899 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Start _get_guest_xml network_info=[{"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.905 186792 WARNING nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.914 186792 DEBUG nova.virt.libvirt.host [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.915 186792 DEBUG nova.virt.libvirt.host [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.921 186792 DEBUG nova.virt.libvirt.host [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.922 186792 DEBUG nova.virt.libvirt.host [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.923 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.924 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.924 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.924 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.925 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.925 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.925 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.925 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.926 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.926 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.926 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.927 186792 DEBUG nova.virt.hardware [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.931 186792 DEBUG nova.virt.libvirt.vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-208559163',display_name='tempest-ServersTestMultiNic-server-208559163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-208559163',id=112,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-br0sps07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:27Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=57e661f0-0363-413f-8e02-7c12e8336789,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.932 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.932 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.934 186792 DEBUG nova.virt.libvirt.vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-208559163',display_name='tempest-ServersTestMultiNic-server-208559163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-208559163',id=112,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-br0sps07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:27Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=57e661f0-0363-413f-8e02-7c12e8336789,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.934 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.935 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.936 186792 DEBUG nova.objects.instance [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 57e661f0-0363-413f-8e02-7c12e8336789 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.947 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <uuid>57e661f0-0363-413f-8e02-7c12e8336789</uuid>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <name>instance-00000070</name>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersTestMultiNic-server-208559163</nova:name>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:07:43</nova:creationTime>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:user uuid="d1b53cb76c914b98afb487ff6059ebfe">tempest-ServersTestMultiNic-1824610177-project-member</nova:user>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:project uuid="dd19aebd63694f83a0bdbf1e376177d5">tempest-ServersTestMultiNic-1824610177</nova:project>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:port uuid="02662e23-ed3c-4273-8579-899251b4f9ea">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.191" ipVersion="4"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        <nova:port uuid="495fd4f0-57d3-4f58-a770-801f0d4d3e57">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.1.122" ipVersion="4"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <entry name="serial">57e661f0-0363-413f-8e02-7c12e8336789</entry>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <entry name="uuid">57e661f0-0363-413f-8e02-7c12e8336789</entry>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.config"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:f5:44:d9"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <target dev="tap02662e23-ed"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:f6:a5:54"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <target dev="tap495fd4f0-57"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/console.log" append="off"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:07:43 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:07:43 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:07:43 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:07:43 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.949 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Preparing to wait for external event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.949 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.949 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.949 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.950 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Preparing to wait for external event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.950 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.950 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.951 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.951 186792 DEBUG nova.virt.libvirt.vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-208559163',display_name='tempest-ServersTestMultiNic-server-208559163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-208559163',id=112,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-br0sps07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:27Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=57e661f0-0363-413f-8e02-7c12e8336789,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.952 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.953 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.953 186792 DEBUG os_vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.954 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.954 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.955 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.959 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02662e23-ed, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.960 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02662e23-ed, col_values=(('external_ids', {'iface-id': '02662e23-ed3c-4273-8579-899251b4f9ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:44:d9', 'vm-uuid': '57e661f0-0363-413f-8e02-7c12e8336789'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.962 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 NetworkManager[55166]: <info>  [1763798863.9634] manager: (tap02662e23-ed): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.963 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.968 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.969 186792 INFO os_vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed')#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.970 186792 DEBUG nova.virt.libvirt.vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-208559163',display_name='tempest-ServersTestMultiNic-server-208559163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-208559163',id=112,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-br0sps07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:27Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=57e661f0-0363-413f-8e02-7c12e8336789,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.971 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.972 186792 DEBUG nova.network.os_vif_util [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.972 186792 DEBUG os_vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.973 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.973 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.973 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.976 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.976 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap495fd4f0-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.977 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap495fd4f0-57, col_values=(('external_ids', {'iface-id': '495fd4f0-57d3-4f58-a770-801f0d4d3e57', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:a5:54', 'vm-uuid': '57e661f0-0363-413f-8e02-7c12e8336789'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.978 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 NetworkManager[55166]: <info>  [1763798863.9803] manager: (tap495fd4f0-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.981 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.990 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:43 np0005531888 nova_compute[186788]: 2025-11-22 08:07:43.991 186792 INFO os_vif [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57')#033[00m
Nov 22 03:07:44 np0005531888 nova_compute[186788]: 2025-11-22 08:07:44.148 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:44 np0005531888 nova_compute[186788]: 2025-11-22 08:07:44.149 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:44 np0005531888 nova_compute[186788]: 2025-11-22 08:07:44.149 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No VIF found with MAC fa:16:3e:f5:44:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:44 np0005531888 nova_compute[186788]: 2025-11-22 08:07:44.149 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] No VIF found with MAC fa:16:3e:f6:a5:54, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:44 np0005531888 nova_compute[186788]: 2025-11-22 08:07:44.150 186792 INFO nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Using config drive#033[00m
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.403 186792 INFO nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Creating config drive at /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.config#033[00m
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.407 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0t9x79jp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.534 186792 DEBUG oslo_concurrency.processutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0t9x79jp" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:45 np0005531888 kernel: tap02662e23-ed: entered promiscuous mode
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.6126] manager: (tap02662e23-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/190)
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00395|binding|INFO|Claiming lport 02662e23-ed3c-4273-8579-899251b4f9ea for this chassis.
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00396|binding|INFO|02662e23-ed3c-4273-8579-899251b4f9ea: Claiming fa:16:3e:f5:44:d9 10.100.0.191
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.615 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.632 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:44:d9 10.100.0.191'], port_security=['fa:16:3e:f5:44:d9 10.100.0.191'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.191/24', 'neutron:device_id': '57e661f0-0363-413f-8e02-7c12e8336789', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd26b766-290a-462e-b30d-c679f0bad9dd, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=02662e23-ed3c-4273-8579-899251b4f9ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.633 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 02662e23-ed3c-4273-8579-899251b4f9ea in datapath 68c429a3-410e-4fc5-a2f5-3128722c4b12 bound to our chassis#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.635 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 68c429a3-410e-4fc5-a2f5-3128722c4b12#033[00m
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.6454] manager: (tap495fd4f0-57): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.647 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b81cd1-dfe7-429f-bce5-15155cdefc7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.649 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap68c429a3-41 in ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.650 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap68c429a3-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.650 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1282bdc-f55a-4c81-9474-674f25a16f84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.651 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6960d94f-0575-4333-9f00-53b3ad052fb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 kernel: tap495fd4f0-57: entered promiscuous mode
Nov 22 03:07:45 np0005531888 systemd-udevd[231909]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:45 np0005531888 systemd-udevd[231910]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00397|binding|INFO|Claiming lport 495fd4f0-57d3-4f58-a770-801f0d4d3e57 for this chassis.
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00398|binding|INFO|495fd4f0-57d3-4f58-a770-801f0d4d3e57: Claiming fa:16:3e:f6:a5:54 10.100.1.122
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.658 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00399|binding|INFO|Setting lport 02662e23-ed3c-4273-8579-899251b4f9ea ovn-installed in OVS
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.665 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.665 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[de844e2b-9eaa-48d9-b953-40a36a45d052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00400|binding|INFO|Setting lport 02662e23-ed3c-4273-8579-899251b4f9ea up in Southbound
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.671 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:a5:54 10.100.1.122'], port_security=['fa:16:3e:f6:a5:54 10.100.1.122'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.122/24', 'neutron:device_id': '57e661f0-0363-413f-8e02-7c12e8336789', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54b86579-e994-4e47-90b0-4a37929dd096', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1be82569-6aa7-4cfc-b5c0-4a3b5c8ec111, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=495fd4f0-57d3-4f58-a770-801f0d4d3e57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.6744] device (tap495fd4f0-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.6754] device (tap02662e23-ed): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.6769] device (tap495fd4f0-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.6797] device (tap02662e23-ed): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:45 np0005531888 systemd-machined[153106]: New machine qemu-53-instance-00000070.
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00401|binding|INFO|Setting lport 495fd4f0-57d3-4f58-a770-801f0d4d3e57 ovn-installed in OVS
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00402|binding|INFO|Setting lport 495fd4f0-57d3-4f58-a770-801f0d4d3e57 up in Southbound
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.695 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dc77e076-f147-456b-9b08-b13479d970bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.697 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:45 np0005531888 systemd[1]: Started Virtual Machine qemu-53-instance-00000070.
Nov 22 03:07:45 np0005531888 podman[231886]: 2025-11-22 08:07:45.723539285 +0000 UTC m=+0.116384454 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.733 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[85633cbe-2be9-4ee4-97bc-705899cd596e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.738 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c5c44b-40b7-4950-a183-eac7f7f21389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.7402] manager: (tap68c429a3-40): new Veth device (/org/freedesktop/NetworkManager/Devices/192)
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.771 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[54f7dfca-4fe7-4722-8e25-e5fdff5eb4b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.775 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a290464e-f7d7-47c1-bfbe-2c43410fb55b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.7994] device (tap68c429a3-40): carrier: link connected
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.804 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[44c42425-0d14-4fef-a513-8d5e7bac51a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.819 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6458d1ec-1606-4ae2-ab17-ce69ea46ab8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap68c429a3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:b2:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560644, 'reachable_time': 18116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231952, 'error': None, 'target': 'ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.835 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5363cae1-368e-445f-9e5d-bbcdf55bd9f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:b27d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560644, 'tstamp': 560644}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231953, 'error': None, 'target': 'ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.853 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0089e6dc-afc8-42a7-9ed7-5525a3a183df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap68c429a3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:b2:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560644, 'reachable_time': 18116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231954, 'error': None, 'target': 'ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.888 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1a3d55-4288-4ab5-ac78-46b9bbe1c2a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.953 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a05f0f01-3453-4306-b8ec-74c7be314b64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.955 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68c429a3-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.955 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.956 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68c429a3-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:45 np0005531888 NetworkManager[55166]: <info>  [1763798865.9596] manager: (tap68c429a3-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:45 np0005531888 kernel: tap68c429a3-40: entered promiscuous mode
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.962 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap68c429a3-40, col_values=(('external_ids', {'iface-id': '91d45cf4-7743-4e7f-885e-59dde7e617f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:45Z|00403|binding|INFO|Releasing lport 91d45cf4-7743-4e7f-885e-59dde7e617f9 from this chassis (sb_readonly=0)
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.964 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.965 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68c429a3-410e-4fc5-a2f5-3128722c4b12.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68c429a3-410e-4fc5-a2f5-3128722c4b12.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.966 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[166c2673-ff0b-43b7-a767-8f1834427bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.967 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-68c429a3-410e-4fc5-a2f5-3128722c4b12
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/68c429a3-410e-4fc5-a2f5-3128722c4b12.pid.haproxy
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 68c429a3-410e-4fc5-a2f5-3128722c4b12
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:07:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:45.968 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'env', 'PROCESS_TAG=haproxy-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/68c429a3-410e-4fc5-a2f5-3128722c4b12.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:07:45 np0005531888 nova_compute[186788]: 2025-11-22 08:07:45.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.043 186792 DEBUG nova.compute.manager [req-5458153f-2787-4a25-b1b9-a2be074b47ec req-01075470-d719-4c69-a5a8-b9d98f4ce3c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.043 186792 DEBUG oslo_concurrency.lockutils [req-5458153f-2787-4a25-b1b9-a2be074b47ec req-01075470-d719-4c69-a5a8-b9d98f4ce3c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.043 186792 DEBUG oslo_concurrency.lockutils [req-5458153f-2787-4a25-b1b9-a2be074b47ec req-01075470-d719-4c69-a5a8-b9d98f4ce3c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.044 186792 DEBUG oslo_concurrency.lockutils [req-5458153f-2787-4a25-b1b9-a2be074b47ec req-01075470-d719-4c69-a5a8-b9d98f4ce3c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.044 186792 DEBUG nova.compute.manager [req-5458153f-2787-4a25-b1b9-a2be074b47ec req-01075470-d719-4c69-a5a8-b9d98f4ce3c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Processing event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.095 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798866.0939567, 57e661f0-0363-413f-8e02-7c12e8336789 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.096 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] VM Started (Lifecycle Event)#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.123 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.130 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798866.0943372, 57e661f0-0363-413f-8e02-7c12e8336789 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.130 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.150 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.154 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:46 np0005531888 nova_compute[186788]: 2025-11-22 08:07:46.170 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:07:46 np0005531888 podman[231994]: 2025-11-22 08:07:46.367330145 +0000 UTC m=+0.024532834 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:07:46 np0005531888 podman[231994]: 2025-11-22 08:07:46.839512276 +0000 UTC m=+0.496714935 container create 705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:46 np0005531888 systemd[1]: Started libpod-conmon-705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd.scope.
Nov 22 03:07:46 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:07:46 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0eef72e7ac7c0d7c6c569689c9f10e4202b65a9cc4d0771e251e7edd5dee9842/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:07:46 np0005531888 podman[231994]: 2025-11-22 08:07:46.999208073 +0000 UTC m=+0.656410762 container init 705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 03:07:47 np0005531888 podman[231994]: 2025-11-22 08:07:47.004654245 +0000 UTC m=+0.661856904 container start 705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:07:47 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [NOTICE]   (232012) : New worker (232014) forked
Nov 22 03:07:47 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [NOTICE]   (232012) : Loading success.
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.151 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 495fd4f0-57d3-4f58-a770-801f0d4d3e57 in datapath 54b86579-e994-4e47-90b0-4a37929dd096 unbound from our chassis#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.154 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 54b86579-e994-4e47-90b0-4a37929dd096#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.166 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[15e9666c-5ead-44c1-bb59-328cbe4ab8b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.167 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap54b86579-e1 in ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.170 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap54b86579-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.170 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e19b1fa5-41b7-4378-8d5d-29aa232e6a13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.171 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[70e6b15c-2410-491c-bce0-3a50e366b80c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.185 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[c85d6252-f6d2-43fd-a735-86f9e5251b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.201 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[778f92f8-582f-43e5-87b1-b9f93c78d798]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.235 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e6318aa6-74d8-4096-b5ac-73f8a7f307d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.235 186792 DEBUG nova.network.neutron [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updated VIF entry in instance network info cache for port 495fd4f0-57d3-4f58-a770-801f0d4d3e57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.237 186792 DEBUG nova.network.neutron [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updating instance_info_cache with network_info: [{"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.240 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d5135260-09f3-4b2b-962c-63eca8cc6176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 systemd-udevd[231943]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:47 np0005531888 NetworkManager[55166]: <info>  [1763798867.2432] manager: (tap54b86579-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/194)
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.253 186792 DEBUG oslo_concurrency.lockutils [req-a2f8299a-f3cb-40df-bdba-b9214998f13d req-e45cecac-3340-420b-ac85-acf91df77937 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.270 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec79150-0e40-4ca1-999a-7c72ae604a33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.272 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5635cdbc-eaab-46ee-8c39-406c7f2bebcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 NetworkManager[55166]: <info>  [1763798867.2953] device (tap54b86579-e0): carrier: link connected
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.301 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[432318e8-ed00-4070-bb81-ce330bed5448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.316 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8e410e13-9a95-4d1f-ba29-726624290cf8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54b86579-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:3e:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560793, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232033, 'error': None, 'target': 'ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.332 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[adade848-2573-4972-94ef-e09bf2372b84]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:3e39'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560793, 'tstamp': 560793}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232034, 'error': None, 'target': 'ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.350 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[84e4060c-321f-4ea2-8ca4-48891772e2b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54b86579-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:3e:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560793, 'reachable_time': 23587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232035, 'error': None, 'target': 'ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.378 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[218f2960-2cdf-41e3-8042-3cd7c5b4289c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.439 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[23705223-56f4-4e24-a2e7-2612fa1fa256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.441 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54b86579-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.441 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.442 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54b86579-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.444 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531888 NetworkManager[55166]: <info>  [1763798867.4452] manager: (tap54b86579-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Nov 22 03:07:47 np0005531888 kernel: tap54b86579-e0: entered promiscuous mode
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.447 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap54b86579-e0, col_values=(('external_ids', {'iface-id': 'a31a589f-b080-4f00-9f37-f8d8892bfeaf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.448 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:47Z|00404|binding|INFO|Releasing lport a31a589f-b080-4f00-9f37-f8d8892bfeaf from this chassis (sb_readonly=0)
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.462 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.463 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/54b86579-e994-4e47-90b0-4a37929dd096.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/54b86579-e994-4e47-90b0-4a37929dd096.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.464 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3125d750-766c-42dd-93f2-f1ec1a78023b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.466 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-54b86579-e994-4e47-90b0-4a37929dd096
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/54b86579-e994-4e47-90b0-4a37929dd096.pid.haproxy
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 54b86579-e994-4e47-90b0-4a37929dd096
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:07:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:47.467 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096', 'env', 'PROCESS_TAG=haproxy-54b86579-e994-4e47-90b0-4a37929dd096', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/54b86579-e994-4e47-90b0-4a37929dd096.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.713 186792 DEBUG nova.compute.manager [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.714 186792 DEBUG oslo_concurrency.lockutils [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.714 186792 DEBUG oslo_concurrency.lockutils [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.714 186792 DEBUG oslo_concurrency.lockutils [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.714 186792 DEBUG nova.compute.manager [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Processing event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.715 186792 DEBUG nova.compute.manager [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.715 186792 DEBUG oslo_concurrency.lockutils [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.715 186792 DEBUG oslo_concurrency.lockutils [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.715 186792 DEBUG oslo_concurrency.lockutils [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.715 186792 DEBUG nova.compute.manager [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] No waiting events found dispatching network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.715 186792 WARNING nova.compute.manager [req-d43c85ad-66ad-4821-b9e2-8a05fc7e01fb req-1d4dcd40-647c-4129-aee7-c501e187d841 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received unexpected event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.716 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.723 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.724 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798867.7240336, 57e661f0-0363-413f-8e02-7c12e8336789 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.724 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.729 186792 INFO nova.virt.libvirt.driver [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance spawned successfully.#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.729 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.756 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.763 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.768 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.769 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.769 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.770 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.770 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.770 186792 DEBUG nova.virt.libvirt.driver [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.797 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.839 186792 INFO nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Took 20.18 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.840 186792 DEBUG nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:47 np0005531888 podman[232067]: 2025-11-22 08:07:47.869943102 +0000 UTC m=+0.077565857 container create afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:07:47 np0005531888 systemd[1]: Started libpod-conmon-afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e.scope.
Nov 22 03:07:47 np0005531888 podman[232067]: 2025-11-22 08:07:47.816987721 +0000 UTC m=+0.024610496 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:07:47 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.924 186792 INFO nova.compute.manager [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Took 20.94 seconds to build instance.#033[00m
Nov 22 03:07:47 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/558a07a28e8f157bf992c57cc15e7cfbc5de7de7f429e8ab478e1ac51a00b1eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:07:47 np0005531888 nova_compute[186788]: 2025-11-22 08:07:47.944 186792 DEBUG oslo_concurrency.lockutils [None req-d613c189-182d-499a-8249-c580039a7a19 d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:47 np0005531888 podman[232067]: 2025-11-22 08:07:47.958122771 +0000 UTC m=+0.165745526 container init afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:47 np0005531888 podman[232067]: 2025-11-22 08:07:47.964402886 +0000 UTC m=+0.172025631 container start afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:07:47 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [NOTICE]   (232095) : New worker (232097) forked
Nov 22 03:07:47 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [NOTICE]   (232095) : Loading success.
Nov 22 03:07:48 np0005531888 podman[232080]: 2025-11-22 08:07:48.048628857 +0000 UTC m=+0.140551817 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.134 186792 DEBUG nova.compute.manager [req-360681ea-2e33-4362-a9de-b3dbe6b6b7ae req-5545d5a9-b982-421d-8eaf-b6b366353e00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.135 186792 DEBUG oslo_concurrency.lockutils [req-360681ea-2e33-4362-a9de-b3dbe6b6b7ae req-5545d5a9-b982-421d-8eaf-b6b366353e00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.135 186792 DEBUG oslo_concurrency.lockutils [req-360681ea-2e33-4362-a9de-b3dbe6b6b7ae req-5545d5a9-b982-421d-8eaf-b6b366353e00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.135 186792 DEBUG oslo_concurrency.lockutils [req-360681ea-2e33-4362-a9de-b3dbe6b6b7ae req-5545d5a9-b982-421d-8eaf-b6b366353e00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.136 186792 DEBUG nova.compute.manager [req-360681ea-2e33-4362-a9de-b3dbe6b6b7ae req-5545d5a9-b982-421d-8eaf-b6b366353e00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] No waiting events found dispatching network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.136 186792 WARNING nova.compute.manager [req-360681ea-2e33-4362-a9de-b3dbe6b6b7ae req-5545d5a9-b982-421d-8eaf-b6b366353e00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received unexpected event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.544 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:48 np0005531888 nova_compute[186788]: 2025-11-22 08:07:48.978 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.668 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.669 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.669 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.670 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.670 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.678 186792 INFO nova.compute.manager [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Terminating instance#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.687 186792 DEBUG nova.compute.manager [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:07:51 np0005531888 kernel: tap02662e23-ed (unregistering): left promiscuous mode
Nov 22 03:07:51 np0005531888 NetworkManager[55166]: <info>  [1763798871.7153] device (tap02662e23-ed): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.724 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:51Z|00405|binding|INFO|Releasing lport 02662e23-ed3c-4273-8579-899251b4f9ea from this chassis (sb_readonly=0)
Nov 22 03:07:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:51Z|00406|binding|INFO|Setting lport 02662e23-ed3c-4273-8579-899251b4f9ea down in Southbound
Nov 22 03:07:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:51Z|00407|binding|INFO|Removing iface tap02662e23-ed ovn-installed in OVS
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:51.733 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:44:d9 10.100.0.191'], port_security=['fa:16:3e:f5:44:d9 10.100.0.191'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.191/24', 'neutron:device_id': '57e661f0-0363-413f-8e02-7c12e8336789', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd26b766-290a-462e-b30d-c679f0bad9dd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=02662e23-ed3c-4273-8579-899251b4f9ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:51.734 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 02662e23-ed3c-4273-8579-899251b4f9ea in datapath 68c429a3-410e-4fc5-a2f5-3128722c4b12 unbound from our chassis#033[00m
Nov 22 03:07:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:51.736 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 68c429a3-410e-4fc5-a2f5-3128722c4b12, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.737 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:51.737 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[953b0188-c9ce-4cd4-b9e3-af7b94a1c941]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:51.738 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12 namespace which is not needed anymore#033[00m
Nov 22 03:07:51 np0005531888 kernel: tap495fd4f0-57 (unregistering): left promiscuous mode
Nov 22 03:07:51 np0005531888 NetworkManager[55166]: <info>  [1763798871.7691] device (tap495fd4f0-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.773 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:51Z|00408|binding|INFO|Releasing lport 495fd4f0-57d3-4f58-a770-801f0d4d3e57 from this chassis (sb_readonly=0)
Nov 22 03:07:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:51Z|00409|binding|INFO|Setting lport 495fd4f0-57d3-4f58-a770-801f0d4d3e57 down in Southbound
Nov 22 03:07:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:07:51Z|00410|binding|INFO|Removing iface tap495fd4f0-57 ovn-installed in OVS
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.789 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:51.792 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:a5:54 10.100.1.122'], port_security=['fa:16:3e:f6:a5:54 10.100.1.122'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.122/24', 'neutron:device_id': '57e661f0-0363-413f-8e02-7c12e8336789', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54b86579-e994-4e47-90b0-4a37929dd096', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd19aebd63694f83a0bdbf1e376177d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e84b169a-d91c-4e2d-9c43-4fa87c120591', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1be82569-6aa7-4cfc-b5c0-4a3b5c8ec111, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=495fd4f0-57d3-4f58-a770-801f0d4d3e57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.796 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000070.scope: Deactivated successfully.
Nov 22 03:07:51 np0005531888 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000070.scope: Consumed 4.404s CPU time.
Nov 22 03:07:51 np0005531888 podman[232123]: 2025-11-22 08:07:51.826845892 +0000 UTC m=+0.071676615 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.6, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, managed_by=edpm_ansible)
Nov 22 03:07:51 np0005531888 systemd-machined[153106]: Machine qemu-53-instance-00000070 terminated.
Nov 22 03:07:51 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [NOTICE]   (232012) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:51 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [NOTICE]   (232012) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:51 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [WARNING]  (232012) : Exiting Master process...
Nov 22 03:07:51 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [WARNING]  (232012) : Exiting Master process...
Nov 22 03:07:51 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [ALERT]    (232012) : Current worker (232014) exited with code 143 (Terminated)
Nov 22 03:07:51 np0005531888 neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12[232008]: [WARNING]  (232012) : All workers exited. Exiting... (0)
Nov 22 03:07:51 np0005531888 systemd[1]: libpod-705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd.scope: Deactivated successfully.
Nov 22 03:07:51 np0005531888 podman[232167]: 2025-11-22 08:07:51.887114334 +0000 UTC m=+0.056112131 container died 705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 03:07:51 np0005531888 NetworkManager[55166]: <info>  [1763798871.9109] manager: (tap02662e23-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Nov 22 03:07:51 np0005531888 NetworkManager[55166]: <info>  [1763798871.9229] manager: (tap495fd4f0-57): new Tun device (/org/freedesktop/NetworkManager/Devices/197)
Nov 22 03:07:51 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:51 np0005531888 systemd[1]: var-lib-containers-storage-overlay-0eef72e7ac7c0d7c6c569689c9f10e4202b65a9cc4d0771e251e7edd5dee9842-merged.mount: Deactivated successfully.
Nov 22 03:07:51 np0005531888 podman[232167]: 2025-11-22 08:07:51.935752849 +0000 UTC m=+0.104750636 container cleanup 705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:07:51 np0005531888 systemd[1]: libpod-conmon-705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd.scope: Deactivated successfully.
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.976 186792 INFO nova.virt.libvirt.driver [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance destroyed successfully.#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.976 186792 DEBUG nova.objects.instance [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lazy-loading 'resources' on Instance uuid 57e661f0-0363-413f-8e02-7c12e8336789 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.988 186792 DEBUG nova.virt.libvirt.vif [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-208559163',display_name='tempest-ServersTestMultiNic-server-208559163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-208559163',id=112,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-br0sps07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:47Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=57e661f0-0363-413f-8e02-7c12e8336789,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.988 186792 DEBUG nova.network.os_vif_util [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.989 186792 DEBUG nova.network.os_vif_util [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.989 186792 DEBUG os_vif [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.991 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.991 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02662e23-ed, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.993 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:51 np0005531888 nova_compute[186788]: 2025-11-22 08:07:51.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.001 186792 INFO os_vif [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:44:d9,bridge_name='br-int',has_traffic_filtering=True,id=02662e23-ed3c-4273-8579-899251b4f9ea,network=Network(68c429a3-410e-4fc5-a2f5-3128722c4b12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02662e23-ed')#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.001 186792 DEBUG nova.virt.libvirt.vif [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-208559163',display_name='tempest-ServersTestMultiNic-server-208559163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-208559163',id=112,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd19aebd63694f83a0bdbf1e376177d5',ramdisk_id='',reservation_id='r-br0sps07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1824610177',owner_user_name='tempest-ServersTestMultiNic-1824610177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:47Z,user_data=None,user_id='d1b53cb76c914b98afb487ff6059ebfe',uuid=57e661f0-0363-413f-8e02-7c12e8336789,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.002 186792 DEBUG nova.network.os_vif_util [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converting VIF {"id": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "address": "fa:16:3e:f6:a5:54", "network": {"id": "54b86579-e994-4e47-90b0-4a37929dd096", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1048229487", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap495fd4f0-57", "ovs_interfaceid": "495fd4f0-57d3-4f58-a770-801f0d4d3e57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.002 186792 DEBUG nova.network.os_vif_util [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.002 186792 DEBUG os_vif [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.003 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.004 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap495fd4f0-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.005 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.007 186792 INFO os_vif [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:a5:54,bridge_name='br-int',has_traffic_filtering=True,id=495fd4f0-57d3-4f58-a770-801f0d4d3e57,network=Network(54b86579-e994-4e47-90b0-4a37929dd096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap495fd4f0-57')#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.008 186792 INFO nova.virt.libvirt.driver [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Deleting instance files /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789_del#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.009 186792 INFO nova.virt.libvirt.driver [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Deletion of /var/lib/nova/instances/57e661f0-0363-413f-8e02-7c12e8336789_del complete#033[00m
Nov 22 03:07:52 np0005531888 podman[232218]: 2025-11-22 08:07:52.031203767 +0000 UTC m=+0.071900120 container remove 705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.037 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d57bd463-f9d0-48a6-88f7-a33b6e70c5a2]: (4, ('Sat Nov 22 08:07:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12 (705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd)\n705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd\nSat Nov 22 08:07:51 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12 (705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd)\n705a1940f3027ae12067d2ae5b71bde1a4f83bf5b4702f1769117c4d82ce72cd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.038 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6451ba47-33b1-4b54-97b0-d6f2e2fc5112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.039 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68c429a3-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 kernel: tap68c429a3-40: left promiscuous mode
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.058 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b99a4af5-9948-4aab-b14b-9e6ab201cd9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.073 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e34f1714-d6dc-435f-9665-502bb4515f48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.075 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb40c53-ff1b-4e68-a81f-b11c15579fc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.090 186792 DEBUG nova.compute.manager [req-26ce1ff2-54ef-4c53-b18b-9a7c82562328 req-ed2b3d05-9ce8-4b92-8fc5-f6a04900e22f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-unplugged-02662e23-ed3c-4273-8579-899251b4f9ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.090 186792 DEBUG oslo_concurrency.lockutils [req-26ce1ff2-54ef-4c53-b18b-9a7c82562328 req-ed2b3d05-9ce8-4b92-8fc5-f6a04900e22f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.090 186792 DEBUG oslo_concurrency.lockutils [req-26ce1ff2-54ef-4c53-b18b-9a7c82562328 req-ed2b3d05-9ce8-4b92-8fc5-f6a04900e22f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.091 186792 DEBUG oslo_concurrency.lockutils [req-26ce1ff2-54ef-4c53-b18b-9a7c82562328 req-ed2b3d05-9ce8-4b92-8fc5-f6a04900e22f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.091 186792 DEBUG nova.compute.manager [req-26ce1ff2-54ef-4c53-b18b-9a7c82562328 req-ed2b3d05-9ce8-4b92-8fc5-f6a04900e22f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] No waiting events found dispatching network-vif-unplugged-02662e23-ed3c-4273-8579-899251b4f9ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.091 186792 DEBUG nova.compute.manager [req-26ce1ff2-54ef-4c53-b18b-9a7c82562328 req-ed2b3d05-9ce8-4b92-8fc5-f6a04900e22f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-unplugged-02662e23-ed3c-4273-8579-899251b4f9ea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.092 186792 INFO nova.compute.manager [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.093 186792 DEBUG oslo.service.loopingcall [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.093 186792 DEBUG nova.compute.manager [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.093 186792 DEBUG nova.network.neutron [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.096 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[351021ca-46a9-43da-aba8-6dfe47b0cfb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560636, 'reachable_time': 16075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232240, 'error': None, 'target': 'ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 systemd[1]: run-netns-ovnmeta\x2d68c429a3\x2d410e\x2d4fc5\x2da2f5\x2d3128722c4b12.mount: Deactivated successfully.
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.099 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-68c429a3-410e-4fc5-a2f5-3128722c4b12 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.099 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[a291d964-c728-4032-ae27-910fb753c153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.101 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 495fd4f0-57d3-4f58-a770-801f0d4d3e57 in datapath 54b86579-e994-4e47-90b0-4a37929dd096 unbound from our chassis#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.102 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54b86579-e994-4e47-90b0-4a37929dd096, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.103 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d237a0-86fe-427b-a288-6d2d23cc59a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.103 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096 namespace which is not needed anymore#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.149 186792 DEBUG nova.compute.manager [req-441dc3ae-d05d-41ac-ba40-dbf34c6cddca req-22f92560-804c-4ec7-bb55-812115959b53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-unplugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.149 186792 DEBUG oslo_concurrency.lockutils [req-441dc3ae-d05d-41ac-ba40-dbf34c6cddca req-22f92560-804c-4ec7-bb55-812115959b53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.149 186792 DEBUG oslo_concurrency.lockutils [req-441dc3ae-d05d-41ac-ba40-dbf34c6cddca req-22f92560-804c-4ec7-bb55-812115959b53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.149 186792 DEBUG oslo_concurrency.lockutils [req-441dc3ae-d05d-41ac-ba40-dbf34c6cddca req-22f92560-804c-4ec7-bb55-812115959b53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.150 186792 DEBUG nova.compute.manager [req-441dc3ae-d05d-41ac-ba40-dbf34c6cddca req-22f92560-804c-4ec7-bb55-812115959b53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] No waiting events found dispatching network-vif-unplugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.150 186792 DEBUG nova.compute.manager [req-441dc3ae-d05d-41ac-ba40-dbf34c6cddca req-22f92560-804c-4ec7-bb55-812115959b53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-unplugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:07:52 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [NOTICE]   (232095) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:52 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [NOTICE]   (232095) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:52 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [WARNING]  (232095) : Exiting Master process...
Nov 22 03:07:52 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [ALERT]    (232095) : Current worker (232097) exited with code 143 (Terminated)
Nov 22 03:07:52 np0005531888 neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096[232088]: [WARNING]  (232095) : All workers exited. Exiting... (0)
Nov 22 03:07:52 np0005531888 systemd[1]: libpod-afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e.scope: Deactivated successfully.
Nov 22 03:07:52 np0005531888 podman[232258]: 2025-11-22 08:07:52.260434103 +0000 UTC m=+0.051383084 container died afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:07:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay-558a07a28e8f157bf992c57cc15e7cfbc5de7de7f429e8ab478e1ac51a00b1eb-merged.mount: Deactivated successfully.
Nov 22 03:07:52 np0005531888 podman[232258]: 2025-11-22 08:07:52.306138187 +0000 UTC m=+0.097087168 container cleanup afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:07:52 np0005531888 systemd[1]: libpod-conmon-afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e.scope: Deactivated successfully.
Nov 22 03:07:52 np0005531888 podman[232290]: 2025-11-22 08:07:52.368706446 +0000 UTC m=+0.042639290 container remove afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.375 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1ede9414-3dca-4afe-aa9a-8627852d42b9]: (4, ('Sat Nov 22 08:07:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096 (afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e)\nafce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e\nSat Nov 22 08:07:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096 (afce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e)\nafce9383e15d7f4a6049e2651110bdb92884cdff884b6af5fd6d1c2f9af2f58e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.376 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[748ef658-54b9-473c-be16-8720b80162b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.377 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54b86579-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.379 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 kernel: tap54b86579-e0: left promiscuous mode
Nov 22 03:07:52 np0005531888 nova_compute[186788]: 2025-11-22 08:07:52.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.394 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf74e80-27a8-4739-a228-96a55df5a914]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.410 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[59759cea-d396-4958-b4fa-46094b31f712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.411 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3db577f5-7a8c-49e6-8486-6a06a42c9cad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.427 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f87c9407-5948-4335-8d11-cb68485992ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560787, 'reachable_time': 33002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232305, 'error': None, 'target': 'ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.430 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-54b86579-e994-4e47-90b0-4a37929dd096 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:07:52.430 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a9293b-301b-4700-8b82-822d70ccc82a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:52 np0005531888 systemd[1]: run-netns-ovnmeta\x2d54b86579\x2de994\x2d4e47\x2d90b0\x2d4a37929dd096.mount: Deactivated successfully.
Nov 22 03:07:53 np0005531888 nova_compute[186788]: 2025-11-22 08:07:53.546 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.204 186792 DEBUG nova.compute.manager [req-2e622d75-1550-421b-801d-1cb949ebdf18 req-f6ebe2a3-09d4-4b7a-adc7-5cc876f79096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.204 186792 DEBUG oslo_concurrency.lockutils [req-2e622d75-1550-421b-801d-1cb949ebdf18 req-f6ebe2a3-09d4-4b7a-adc7-5cc876f79096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.204 186792 DEBUG oslo_concurrency.lockutils [req-2e622d75-1550-421b-801d-1cb949ebdf18 req-f6ebe2a3-09d4-4b7a-adc7-5cc876f79096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.205 186792 DEBUG oslo_concurrency.lockutils [req-2e622d75-1550-421b-801d-1cb949ebdf18 req-f6ebe2a3-09d4-4b7a-adc7-5cc876f79096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.205 186792 DEBUG nova.compute.manager [req-2e622d75-1550-421b-801d-1cb949ebdf18 req-f6ebe2a3-09d4-4b7a-adc7-5cc876f79096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] No waiting events found dispatching network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.205 186792 WARNING nova.compute.manager [req-2e622d75-1550-421b-801d-1cb949ebdf18 req-f6ebe2a3-09d4-4b7a-adc7-5cc876f79096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received unexpected event network-vif-plugged-02662e23-ed3c-4273-8579-899251b4f9ea for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.247 186792 DEBUG nova.compute.manager [req-00f79e07-817e-4a34-9e93-3065967368c5 req-aa5fb3af-5b67-43d7-809f-5aa052a0ca9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.247 186792 DEBUG oslo_concurrency.lockutils [req-00f79e07-817e-4a34-9e93-3065967368c5 req-aa5fb3af-5b67-43d7-809f-5aa052a0ca9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "57e661f0-0363-413f-8e02-7c12e8336789-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.247 186792 DEBUG oslo_concurrency.lockutils [req-00f79e07-817e-4a34-9e93-3065967368c5 req-aa5fb3af-5b67-43d7-809f-5aa052a0ca9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.247 186792 DEBUG oslo_concurrency.lockutils [req-00f79e07-817e-4a34-9e93-3065967368c5 req-aa5fb3af-5b67-43d7-809f-5aa052a0ca9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.248 186792 DEBUG nova.compute.manager [req-00f79e07-817e-4a34-9e93-3065967368c5 req-aa5fb3af-5b67-43d7-809f-5aa052a0ca9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] No waiting events found dispatching network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.248 186792 WARNING nova.compute.manager [req-00f79e07-817e-4a34-9e93-3065967368c5 req-aa5fb3af-5b67-43d7-809f-5aa052a0ca9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received unexpected event network-vif-plugged-495fd4f0-57d3-4f58-a770-801f0d4d3e57 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.308 186792 DEBUG nova.compute.manager [req-497384fc-69a4-4dc9-8de1-a902a511b2b9 req-343c0961-a4bb-4a2a-979a-217c16b14b13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-deleted-495fd4f0-57d3-4f58-a770-801f0d4d3e57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.308 186792 INFO nova.compute.manager [req-497384fc-69a4-4dc9-8de1-a902a511b2b9 req-343c0961-a4bb-4a2a-979a-217c16b14b13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Neutron deleted interface 495fd4f0-57d3-4f58-a770-801f0d4d3e57; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.309 186792 DEBUG nova.network.neutron [req-497384fc-69a4-4dc9-8de1-a902a511b2b9 req-343c0961-a4bb-4a2a-979a-217c16b14b13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updating instance_info_cache with network_info: [{"id": "02662e23-ed3c-4273-8579-899251b4f9ea", "address": "fa:16:3e:f5:44:d9", "network": {"id": "68c429a3-410e-4fc5-a2f5-3128722c4b12", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-696662446", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dd19aebd63694f83a0bdbf1e376177d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02662e23-ed", "ovs_interfaceid": "02662e23-ed3c-4273-8579-899251b4f9ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.328 186792 DEBUG nova.compute.manager [req-497384fc-69a4-4dc9-8de1-a902a511b2b9 req-343c0961-a4bb-4a2a-979a-217c16b14b13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Detach interface failed, port_id=495fd4f0-57d3-4f58-a770-801f0d4d3e57, reason: Instance 57e661f0-0363-413f-8e02-7c12e8336789 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.799 186792 DEBUG nova.network.neutron [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.813 186792 INFO nova.compute.manager [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Took 2.72 seconds to deallocate network for instance.#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.873 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.873 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.946 186792 DEBUG nova.compute.provider_tree [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.970 186792 DEBUG nova.scheduler.client.report [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:54 np0005531888 nova_compute[186788]: 2025-11-22 08:07:54.993 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.021 186792 INFO nova.scheduler.client.report [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Deleted allocations for instance 57e661f0-0363-413f-8e02-7c12e8336789#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.092 186792 DEBUG oslo_concurrency.lockutils [None req-9692fad7-956e-4b25-b04a-fce1c523100a d1b53cb76c914b98afb487ff6059ebfe dd19aebd63694f83a0bdbf1e376177d5 - - default default] Lock "57e661f0-0363-413f-8e02-7c12e8336789" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.210 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.211 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.211 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.211 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 57e661f0-0363-413f-8e02-7c12e8336789 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.256 186792 DEBUG nova.compute.utils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010#033[00m
Nov 22 03:07:55 np0005531888 nova_compute[186788]: 2025-11-22 08:07:55.463 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:07:56 np0005531888 nova_compute[186788]: 2025-11-22 08:07:56.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:56 np0005531888 nova_compute[186788]: 2025-11-22 08:07:56.403 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:56 np0005531888 nova_compute[186788]: 2025-11-22 08:07:56.436 186792 DEBUG nova.compute.manager [req-4759eb70-9ce0-4df8-a556-38949998e695 req-05b25d13-c647-4b40-892e-c1e056468152 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Received event network-vif-deleted-02662e23-ed3c-4273-8579-899251b4f9ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:56 np0005531888 nova_compute[186788]: 2025-11-22 08:07:56.470 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-57e661f0-0363-413f-8e02-7c12e8336789" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:56 np0005531888 nova_compute[186788]: 2025-11-22 08:07:56.470 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:07:56 np0005531888 nova_compute[186788]: 2025-11-22 08:07:56.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:57 np0005531888 nova_compute[186788]: 2025-11-22 08:07:57.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:57 np0005531888 podman[232306]: 2025-11-22 08:07:57.692077315 +0000 UTC m=+0.065840540 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:57 np0005531888 podman[232307]: 2025-11-22 08:07:57.712503487 +0000 UTC m=+0.084806256 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:57 np0005531888 nova_compute[186788]: 2025-11-22 08:07:57.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:58 np0005531888 nova_compute[186788]: 2025-11-22 08:07:58.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:59 np0005531888 nova_compute[186788]: 2025-11-22 08:07:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:59 np0005531888 nova_compute[186788]: 2025-11-22 08:07:59.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:00 np0005531888 nova_compute[186788]: 2025-11-22 08:08:00.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:02 np0005531888 nova_compute[186788]: 2025-11-22 08:08:02.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531888 nova_compute[186788]: 2025-11-22 08:08:03.550 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531888 nova_compute[186788]: 2025-11-22 08:08:03.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:03 np0005531888 nova_compute[186788]: 2025-11-22 08:08:03.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:08:04 np0005531888 nova_compute[186788]: 2025-11-22 08:08:04.957 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:04 np0005531888 nova_compute[186788]: 2025-11-22 08:08:04.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:04 np0005531888 nova_compute[186788]: 2025-11-22 08:08:04.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:04 np0005531888 nova_compute[186788]: 2025-11-22 08:08:04.985 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:04 np0005531888 nova_compute[186788]: 2025-11-22 08:08:04.985 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.166 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.167 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5668MB free_disk=73.27509307861328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.168 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.168 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.222 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.223 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.254 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.270 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.291 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:08:05 np0005531888 nova_compute[186788]: 2025-11-22 08:08:05.291 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:06 np0005531888 podman[232352]: 2025-11-22 08:08:06.688186683 +0000 UTC m=+0.056271154 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:08:06 np0005531888 podman[232353]: 2025-11-22 08:08:06.712827899 +0000 UTC m=+0.080049229 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 03:08:06 np0005531888 nova_compute[186788]: 2025-11-22 08:08:06.977 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798871.9740531, 57e661f0-0363-413f-8e02-7c12e8336789 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:08:06 np0005531888 nova_compute[186788]: 2025-11-22 08:08:06.977 186792 INFO nova.compute.manager [-] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:08:06 np0005531888 nova_compute[186788]: 2025-11-22 08:08:06.999 186792 DEBUG nova.compute.manager [None req-0b12fdf3-ebad-478a-a0ef-64bc65ae4a75 - - - - - -] [instance: 57e661f0-0363-413f-8e02-7c12e8336789] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.011 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.042 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.042 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.055 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.132 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.132 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.139 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.139 186792 INFO nova.compute.claims [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.240 186792 DEBUG nova.compute.provider_tree [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.255 186792 DEBUG nova.scheduler.client.report [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.275 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.276 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.323 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.324 186792 DEBUG nova.network.neutron [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.338 186792 INFO nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.350 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.443 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.445 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.445 186792 INFO nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Creating image(s)#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.446 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "/var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.446 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "/var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.447 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "/var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.459 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.523 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.524 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.524 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.537 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.598 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:07 np0005531888 nova_compute[186788]: 2025-11-22 08:08:07.599 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.088 186792 DEBUG nova.policy [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.198 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk 1073741824" returned: 0 in 0.599s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.199 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.199 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.261 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.262 186792 DEBUG nova.virt.disk.api [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Checking if we can resize image /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.262 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.288 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.320 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.321 186792 DEBUG nova.virt.disk.api [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Cannot resize image /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.321 186792 DEBUG nova.objects.instance [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lazy-loading 'migration_context' on Instance uuid e2cd40ca-3604-448b-b55d-f90698a0f28a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.338 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.339 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Ensure instance console log exists: /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.339 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.339 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.340 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.553 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:08 np0005531888 nova_compute[186788]: 2025-11-22 08:08:08.878 186792 DEBUG nova.network.neutron [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Successfully created port: 7dac20df-e449-4a6a-876e-07468688cf7b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.082 186792 DEBUG nova.network.neutron [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Successfully updated port: 7dac20df-e449-4a6a-876e-07468688cf7b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.095 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.096 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquired lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.096 186792 DEBUG nova.network.neutron [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.220 186792 DEBUG nova.compute.manager [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-changed-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.220 186792 DEBUG nova.compute.manager [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Refreshing instance network info cache due to event network-changed-7dac20df-e449-4a6a-876e-07468688cf7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.221 186792 DEBUG oslo_concurrency.lockutils [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:10 np0005531888 nova_compute[186788]: 2025-11-22 08:08:10.736 186792 DEBUG nova.network.neutron [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.118 186792 DEBUG nova.network.neutron [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Updating instance_info_cache with network_info: [{"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.144 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Releasing lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.144 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Instance network_info: |[{"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.145 186792 DEBUG oslo_concurrency.lockutils [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.145 186792 DEBUG nova.network.neutron [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Refreshing network info cache for port 7dac20df-e449-4a6a-876e-07468688cf7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.148 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Start _get_guest_xml network_info=[{"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.153 186792 WARNING nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.160 186792 DEBUG nova.virt.libvirt.host [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.161 186792 DEBUG nova.virt.libvirt.host [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.165 186792 DEBUG nova.virt.libvirt.host [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.165 186792 DEBUG nova.virt.libvirt.host [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.166 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.166 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.167 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.167 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.167 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.168 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.168 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.168 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.168 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.169 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.169 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.169 186792 DEBUG nova.virt.hardware [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.173 186792 DEBUG nova.virt.libvirt.vif [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1178214837',display_name='tempest-TestServerBasicOps-server-1178214837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1178214837',id=114,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDpQsp5k296LFuveuWYcfdhkRkHtXKdiOD6yU4/A12CwH5o3asnU1Q6kA3/dIDgqZ6lPIsmTP7C6Jn7mm6MeODR/3nE0CUvAUSQE/z09SRk1dnZgAMLgLvoPg7AT8LWEw==',key_name='tempest-TestServerBasicOps-712723952',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80d80bc50bfd40539762353a02ff7870',ramdisk_id='',reservation_id='r-600vfdbm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1001913954',owner_user_name='tempest-TestServerBasicOps-1001913954-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3ec42aac51d84cf985243c562087f0fa',uuid=e2cd40ca-3604-448b-b55d-f90698a0f28a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.174 186792 DEBUG nova.network.os_vif_util [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Converting VIF {"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.174 186792 DEBUG nova.network.os_vif_util [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.175 186792 DEBUG nova.objects.instance [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lazy-loading 'pci_devices' on Instance uuid e2cd40ca-3604-448b-b55d-f90698a0f28a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.187 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <uuid>e2cd40ca-3604-448b-b55d-f90698a0f28a</uuid>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <name>instance-00000072</name>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestServerBasicOps-server-1178214837</nova:name>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:08:12</nova:creationTime>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:user uuid="3ec42aac51d84cf985243c562087f0fa">tempest-TestServerBasicOps-1001913954-project-member</nova:user>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:project uuid="80d80bc50bfd40539762353a02ff7870">tempest-TestServerBasicOps-1001913954</nova:project>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        <nova:port uuid="7dac20df-e449-4a6a-876e-07468688cf7b">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <entry name="serial">e2cd40ca-3604-448b-b55d-f90698a0f28a</entry>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <entry name="uuid">e2cd40ca-3604-448b-b55d-f90698a0f28a</entry>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.config"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:d1:39:da"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <target dev="tap7dac20df-e4"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/console.log" append="off"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:08:12 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:08:12 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:08:12 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:08:12 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.189 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Preparing to wait for external event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.189 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.189 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.190 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.190 186792 DEBUG nova.virt.libvirt.vif [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1178214837',display_name='tempest-TestServerBasicOps-server-1178214837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1178214837',id=114,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDpQsp5k296LFuveuWYcfdhkRkHtXKdiOD6yU4/A12CwH5o3asnU1Q6kA3/dIDgqZ6lPIsmTP7C6Jn7mm6MeODR/3nE0CUvAUSQE/z09SRk1dnZgAMLgLvoPg7AT8LWEw==',key_name='tempest-TestServerBasicOps-712723952',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80d80bc50bfd40539762353a02ff7870',ramdisk_id='',reservation_id='r-600vfdbm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1001913954',owner_user_name='tempest-TestServerBasicOps-1001913954-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3ec42aac51d84cf985243c562087f0fa',uuid=e2cd40ca-3604-448b-b55d-f90698a0f28a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.191 186792 DEBUG nova.network.os_vif_util [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Converting VIF {"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.191 186792 DEBUG nova.network.os_vif_util [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.191 186792 DEBUG os_vif [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.192 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.192 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.193 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.196 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.196 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7dac20df-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.196 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7dac20df-e4, col_values=(('external_ids', {'iface-id': '7dac20df-e449-4a6a-876e-07468688cf7b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:39:da', 'vm-uuid': 'e2cd40ca-3604-448b-b55d-f90698a0f28a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:12 np0005531888 NetworkManager[55166]: <info>  [1763798892.1993] manager: (tap7dac20df-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.198 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.207 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.210 186792 INFO os_vif [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4')#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.250 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.250 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.250 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] No VIF found with MAC fa:16:3e:d1:39:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.251 186792 INFO nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Using config drive#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.679 186792 INFO nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Creating config drive at /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.config#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.686 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_8bfxhdn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.814 186792 DEBUG oslo_concurrency.processutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_8bfxhdn" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:12 np0005531888 kernel: tap7dac20df-e4: entered promiscuous mode
Nov 22 03:08:12 np0005531888 NetworkManager[55166]: <info>  [1763798892.8841] manager: (tap7dac20df-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.886 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:12Z|00411|binding|INFO|Claiming lport 7dac20df-e449-4a6a-876e-07468688cf7b for this chassis.
Nov 22 03:08:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:12Z|00412|binding|INFO|7dac20df-e449-4a6a-876e-07468688cf7b: Claiming fa:16:3e:d1:39:da 10.100.0.7
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.892 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.896 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.904 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:39:da 10.100.0.7'], port_security=['fa:16:3e:d1:39:da 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80d80bc50bfd40539762353a02ff7870', 'neutron:revision_number': '2', 'neutron:security_group_ids': '13cb9ae7-f274-4b5b-a055-de5b2118e834 e9d21bf8-72d1-4f3d-ad30-87942f562ac1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b93c281-d03a-4cdc-bbb2-39cdc3f7840a, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=7dac20df-e449-4a6a-876e-07468688cf7b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.906 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 7dac20df-e449-4a6a-876e-07468688cf7b in datapath e29c4867-0d91-40a0-a0bb-dd5eead4a0be bound to our chassis#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.907 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e29c4867-0d91-40a0-a0bb-dd5eead4a0be#033[00m
Nov 22 03:08:12 np0005531888 systemd-udevd[232426]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.922 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2f1115-1788-444e-9faa-e8e545f07e10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.923 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape29c4867-01 in ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.926 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape29c4867-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.926 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[44d57319-11a3-4561-8b10-bedc967c61c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.927 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d4de7e4a-eb63-4072-9710-98b701d4f889]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:12 np0005531888 systemd-machined[153106]: New machine qemu-54-instance-00000072.
Nov 22 03:08:12 np0005531888 NetworkManager[55166]: <info>  [1763798892.9357] device (tap7dac20df-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:08:12 np0005531888 NetworkManager[55166]: <info>  [1763798892.9365] device (tap7dac20df-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.941 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[8df2d6e7-bf45-41ba-b546-35ae06fd93de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:12 np0005531888 systemd[1]: Started Virtual Machine qemu-54-instance-00000072.
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:12Z|00413|binding|INFO|Setting lport 7dac20df-e449-4a6a-876e-07468688cf7b ovn-installed in OVS
Nov 22 03:08:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:12Z|00414|binding|INFO|Setting lport 7dac20df-e449-4a6a-876e-07468688cf7b up in Southbound
Nov 22 03:08:12 np0005531888 nova_compute[186788]: 2025-11-22 08:08:12.960 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.961 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[70806a13-ff96-4896-b9aa-f820dc90eb84]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:12.994 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb7544a-6a25-47a5-833c-650ee937d4e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 systemd-udevd[232429]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:08:13 np0005531888 NetworkManager[55166]: <info>  [1763798893.0037] manager: (tape29c4867-00): new Veth device (/org/freedesktop/NetworkManager/Devices/200)
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.004 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a85440b3-511d-4cc9-aa63-0fa1138c81e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.044 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[609ab3bb-795c-4370-aca5-0ed1245d2ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.049 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[67bb5a79-1585-4933-9f3f-b6dd54050260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 NetworkManager[55166]: <info>  [1763798893.0815] device (tape29c4867-00): carrier: link connected
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.090 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[229146e3-4926-4ce4-9e22-dfb9d879c828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.108 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c19067-55bf-4030-8b39-7ca433e7b881]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape29c4867-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:53:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563372, 'reachable_time': 28652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232458, 'error': None, 'target': 'ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.127 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3099b3c7-d98c-4c1b-9ef1-3e38cdbb82f7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:5365'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563372, 'tstamp': 563372}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232459, 'error': None, 'target': 'ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.145 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d54c46d4-1410-49cd-a88a-0513b7e2fa82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape29c4867-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:53:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563372, 'reachable_time': 28652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232460, 'error': None, 'target': 'ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.180 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[87717274-851b-4de2-9cae-28f8a3310dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.238 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[420bfb1b-c9c6-40e1-8fde-161642c41764]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 kernel: tape29c4867-00: entered promiscuous mode
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.239 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape29c4867-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.240 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.240 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape29c4867-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:13 np0005531888 nova_compute[186788]: 2025-11-22 08:08:13.242 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:13 np0005531888 NetworkManager[55166]: <info>  [1763798893.2428] manager: (tape29c4867-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Nov 22 03:08:13 np0005531888 nova_compute[186788]: 2025-11-22 08:08:13.244 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.246 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape29c4867-00, col_values=(('external_ids', {'iface-id': '05da5975-a719-42f0-8f3a-0ff76d100892'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:13 np0005531888 nova_compute[186788]: 2025-11-22 08:08:13.247 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:13Z|00415|binding|INFO|Releasing lport 05da5975-a719-42f0-8f3a-0ff76d100892 from this chassis (sb_readonly=0)
Nov 22 03:08:13 np0005531888 nova_compute[186788]: 2025-11-22 08:08:13.259 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.261 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e29c4867-0d91-40a0-a0bb-dd5eead4a0be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e29c4867-0d91-40a0-a0bb-dd5eead4a0be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.262 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[022c65c7-d638-46b6-83e3-29d5bc1208dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.263 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-e29c4867-0d91-40a0-a0bb-dd5eead4a0be
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/e29c4867-0d91-40a0-a0bb-dd5eead4a0be.pid.haproxy
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID e29c4867-0d91-40a0-a0bb-dd5eead4a0be
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:08:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:13.264 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'env', 'PROCESS_TAG=haproxy-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e29c4867-0d91-40a0-a0bb-dd5eead4a0be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:08:13 np0005531888 nova_compute[186788]: 2025-11-22 08:08:13.554 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:13 np0005531888 podman[232491]: 2025-11-22 08:08:13.63660602 +0000 UTC m=+0.063605424 container create fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:08:13 np0005531888 systemd[1]: Started libpod-conmon-fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f.scope.
Nov 22 03:08:13 np0005531888 podman[232491]: 2025-11-22 08:08:13.603272971 +0000 UTC m=+0.030272395 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:08:13 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:08:13 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32ae1e908eaf8fd3b41e22495484bc99bb278e00a4c24093df38621cbe3f2b42/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:08:13 np0005531888 podman[232491]: 2025-11-22 08:08:13.72891372 +0000 UTC m=+0.155913144 container init fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:08:13 np0005531888 podman[232491]: 2025-11-22 08:08:13.734552269 +0000 UTC m=+0.161551673 container start fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:08:13 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [NOTICE]   (232512) : New worker (232514) forked
Nov 22 03:08:13 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [NOTICE]   (232512) : Loading success.
Nov 22 03:08:13 np0005531888 nova_compute[186788]: 2025-11-22 08:08:13.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:14.365 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:08:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:14.366 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.367 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.403 186792 DEBUG nova.network.neutron [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Updated VIF entry in instance network info cache for port 7dac20df-e449-4a6a-876e-07468688cf7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.404 186792 DEBUG nova.network.neutron [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Updating instance_info_cache with network_info: [{"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.419 186792 DEBUG oslo_concurrency.lockutils [req-799d7d98-c3fe-4432-8a03-ef18e9764c86 req-c6d00857-45f8-4bf9-b214-acef40af0bb2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.504 186792 DEBUG nova.compute.manager [req-03eca782-36cf-4c2a-8b56-29839e157519 req-703da2d7-14cb-4d70-905b-b5a79439adcf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.505 186792 DEBUG oslo_concurrency.lockutils [req-03eca782-36cf-4c2a-8b56-29839e157519 req-703da2d7-14cb-4d70-905b-b5a79439adcf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.505 186792 DEBUG oslo_concurrency.lockutils [req-03eca782-36cf-4c2a-8b56-29839e157519 req-703da2d7-14cb-4d70-905b-b5a79439adcf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.505 186792 DEBUG oslo_concurrency.lockutils [req-03eca782-36cf-4c2a-8b56-29839e157519 req-703da2d7-14cb-4d70-905b-b5a79439adcf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.506 186792 DEBUG nova.compute.manager [req-03eca782-36cf-4c2a-8b56-29839e157519 req-703da2d7-14cb-4d70-905b-b5a79439adcf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Processing event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.653 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798894.6528394, e2cd40ca-3604-448b-b55d-f90698a0f28a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.653 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] VM Started (Lifecycle Event)#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.656 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.659 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.663 186792 INFO nova.virt.libvirt.driver [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Instance spawned successfully.#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.664 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.692 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.700 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.704 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.705 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.705 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.706 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.706 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.707 186792 DEBUG nova.virt.libvirt.driver [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.732 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.733 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798894.6536953, e2cd40ca-3604-448b-b55d-f90698a0f28a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.733 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] VM Paused (Lifecycle Event)
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.758 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.763 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798894.6586971, e2cd40ca-3604-448b-b55d-f90698a0f28a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.764 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] VM Resumed (Lifecycle Event)
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.777 186792 INFO nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Took 7.33 seconds to spawn the instance on the hypervisor.
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.778 186792 DEBUG nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.783 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.789 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.821 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.866 186792 INFO nova.compute.manager [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Took 7.76 seconds to build instance.
Nov 22 03:08:14 np0005531888 nova_compute[186788]: 2025-11-22 08:08:14.883 186792 DEBUG oslo_concurrency.lockutils [None req-406f3d32-14cb-4d85-883d-18e57c9c4750 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:08:16 np0005531888 podman[232530]: 2025-11-22 08:08:16.689487349 +0000 UTC m=+0.062735653 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible)
Nov 22 03:08:16 np0005531888 NetworkManager[55166]: <info>  [1763798896.8329] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Nov 22 03:08:16 np0005531888 NetworkManager[55166]: <info>  [1763798896.8336] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Nov 22 03:08:16 np0005531888 nova_compute[186788]: 2025-11-22 08:08:16.832 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:16 np0005531888 nova_compute[186788]: 2025-11-22 08:08:16.948 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:16Z|00416|binding|INFO|Releasing lport 05da5975-a719-42f0-8f3a-0ff76d100892 from this chassis (sb_readonly=0)
Nov 22 03:08:16 np0005531888 nova_compute[186788]: 2025-11-22 08:08:16.965 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.072 186792 DEBUG nova.compute.manager [req-fd8cc3bf-0245-4ea5-8cb7-4287c1eb8e19 req-88b4f38f-5b27-419e-b17f-1de100639d19 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.073 186792 DEBUG oslo_concurrency.lockutils [req-fd8cc3bf-0245-4ea5-8cb7-4287c1eb8e19 req-88b4f38f-5b27-419e-b17f-1de100639d19 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.073 186792 DEBUG oslo_concurrency.lockutils [req-fd8cc3bf-0245-4ea5-8cb7-4287c1eb8e19 req-88b4f38f-5b27-419e-b17f-1de100639d19 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.073 186792 DEBUG oslo_concurrency.lockutils [req-fd8cc3bf-0245-4ea5-8cb7-4287c1eb8e19 req-88b4f38f-5b27-419e-b17f-1de100639d19 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.074 186792 DEBUG nova.compute.manager [req-fd8cc3bf-0245-4ea5-8cb7-4287c1eb8e19 req-88b4f38f-5b27-419e-b17f-1de100639d19 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] No waiting events found dispatching network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.074 186792 WARNING nova.compute.manager [req-fd8cc3bf-0245-4ea5-8cb7-4287c1eb8e19 req-88b4f38f-5b27-419e-b17f-1de100639d19 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received unexpected event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b for instance with vm_state active and task_state None.
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.199 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.436 186792 DEBUG nova.compute.manager [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-changed-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.437 186792 DEBUG nova.compute.manager [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Refreshing instance network info cache due to event network-changed-7dac20df-e449-4a6a-876e-07468688cf7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.437 186792 DEBUG oslo_concurrency.lockutils [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.438 186792 DEBUG oslo_concurrency.lockutils [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:08:17 np0005531888 nova_compute[186788]: 2025-11-22 08:08:17.438 186792 DEBUG nova.network.neutron [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Refreshing network info cache for port 7dac20df-e449-4a6a-876e-07468688cf7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 03:08:18 np0005531888 nova_compute[186788]: 2025-11-22 08:08:18.563 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:18 np0005531888 podman[232554]: 2025-11-22 08:08:18.685863009 +0000 UTC m=+0.057735811 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:08:18 np0005531888 nova_compute[186788]: 2025-11-22 08:08:18.974 186792 DEBUG nova.network.neutron [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Updated VIF entry in instance network info cache for port 7dac20df-e449-4a6a-876e-07468688cf7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:08:18 np0005531888 nova_compute[186788]: 2025-11-22 08:08:18.974 186792 DEBUG nova.network.neutron [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Updating instance_info_cache with network_info: [{"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:08:18 np0005531888 nova_compute[186788]: 2025-11-22 08:08:18.994 186792 DEBUG oslo_concurrency.lockutils [req-bc8b27d5-c481-468c-a945-53adebffc775 req-e2f7f2db-b6a5-40d3-999c-c6ce39657a04 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e2cd40ca-3604-448b-b55d-f90698a0f28a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:08:21 np0005531888 nova_compute[186788]: 2025-11-22 08:08:21.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:21 np0005531888 nova_compute[186788]: 2025-11-22 08:08:21.903 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:22 np0005531888 nova_compute[186788]: 2025-11-22 08:08:22.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:22 np0005531888 podman[232578]: 2025-11-22 08:08:22.689974007 +0000 UTC m=+0.061973604 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Nov 22 03:08:23 np0005531888 nova_compute[186788]: 2025-11-22 08:08:23.565 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:24 np0005531888 nova_compute[186788]: 2025-11-22 08:08:24.065 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:24 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:24.368 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:08:27 np0005531888 nova_compute[186788]: 2025-11-22 08:08:27.205 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:28Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:39:da 10.100.0.7
Nov 22 03:08:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:28Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:39:da 10.100.0.7
Nov 22 03:08:28 np0005531888 nova_compute[186788]: 2025-11-22 08:08:28.566 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:28 np0005531888 podman[232614]: 2025-11-22 08:08:28.696478137 +0000 UTC m=+0.067889140 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:08:28 np0005531888 podman[232615]: 2025-11-22 08:08:28.716198112 +0000 UTC m=+0.083881813 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:08:30 np0005531888 nova_compute[186788]: 2025-11-22 08:08:30.243 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:32 np0005531888 nova_compute[186788]: 2025-11-22 08:08:32.208 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:33 np0005531888 nova_compute[186788]: 2025-11-22 08:08:33.569 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:34Z|00417|binding|INFO|Releasing lport 05da5975-a719-42f0-8f3a-0ff76d100892 from this chassis (sb_readonly=0)
Nov 22 03:08:34 np0005531888 nova_compute[186788]: 2025-11-22 08:08:34.437 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:36 np0005531888 nova_compute[186788]: 2025-11-22 08:08:36.644 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:08:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:36.823 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:08:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:36.824 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:08:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:36.825 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.847 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'name': 'tempest-TestServerBasicOps-server-1178214837', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000072', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '80d80bc50bfd40539762353a02ff7870', 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'hostId': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.873 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.874 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abefc9b6-23aa-40b2-a758-dd7b55dd460f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.847883', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7275c820-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '6ac5b394ba851b08ff36a98ec2f6b7c6bd741f0b660af1bd2fa4a8a8b0e5d0cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': 
None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.847883', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7275d798-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': 'ed5bf3b20d1e8c35ad33206bbfa93c1eaa7a58b62ddf4158816af4edd0ff974a'}]}, 'timestamp': '2025-11-22 08:08:36.874997', '_unique_id': '0fbd41e10b684474bff4178fdf20241b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.881 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e2cd40ca-3604-448b-b55d-f90698a0f28a / tap7dac20df-e4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.882 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.incoming.bytes volume: 1722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f68cbfd-92ce-4820-beef-d383b98bcbd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1722, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.877896', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '72770834-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': 'a117fb2294a99b9788b6d0c4ed07b0f2c38a75f119da635c1c29da279f5237b5'}]}, 'timestamp': '2025-11-22 08:08:36.882894', '_unique_id': 'b20b36bb2fe44851a760ff9d57e0d0bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.885 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.885 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>]
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.886 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c939802-5425-49dc-9193-4f9165894738', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.886003', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '72779416-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': '6f36adc2a7dfba64261586575c5165c8c99649c17aaa36c96d78b6e6b932cebb'}]}, 'timestamp': '2025-11-22 08:08:36.886384', '_unique_id': 'dedcc58d2b2d4c189aee8a79022a3516'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.887 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.888 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.888 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07afc1ef-b5e7-44f5-8cdb-097dc82d1b18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.888081', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7277e434-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '2a81ccbce104c48baf18de9121d7f4d89859385d3dd82c5d60797ac96ab01b1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': 
None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.888081', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7277ed9e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '56de692395c92f7534da37eb42a135094627164105a73552ab56a82bca37908f'}]}, 'timestamp': '2025-11-22 08:08:36.888658', '_unique_id': '096ecc1a7017408284cd6f38b151a606'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.890 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.890 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>]
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.890 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '166c5904-e992-42a7-bb8b-99e4a86a95a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.890306', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '72783ad8-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': 'fb08ba1f7b39d3c1cc3c79161df5140e1584ed04778c8f8bc045780421445b22'}]}, 'timestamp': '2025-11-22 08:08:36.890664', '_unique_id': '69eae07afc404fa596240acd4cfe120d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.891 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe539e1f-73b8-4339-8b4d-8f8dd91b30b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.891944', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7278787c-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '969c5301f39a4920f426d42a222315f83619503fa4735585466e2cecac243dc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.891944', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72788132-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': 'a73fbb87ffe69069d66887a1172413dea2d11e84507964f83a0bc3c0299667b6'}]}, 'timestamp': '2025-11-22 08:08:36.892387', '_unique_id': 'e622abada4a84750a987085618b5318a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.893 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.912 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/cpu volume: 12460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2755e01d-8c9c-4e6b-aca1-5f918311abc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12460000000, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'timestamp': '2025-11-22T08:08:36.893654', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '727ba664-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.611882335, 'message_signature': 'bebffdebb472e28a5702b049c5af08e898c771226de4cf737bd26c8a5c1f44c7'}]}, 'timestamp': '2025-11-22 08:08:36.913075', '_unique_id': '362af7d4703e4bc78a3fe02bf4310a57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.914 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.915 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>]
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.915 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99ab428a-e79c-4215-90e2-f51674946a7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.915371', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '727c0e74-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': '588f96777ee60193422aa5372dda6ef7cdc64df10182a8a467052de496e1dd87'}]}, 'timestamp': '2025-11-22 08:08:36.915737', '_unique_id': '4e3fa04b9a7f4b9fb2ccc7f9f815b882'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.917 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.write.latency volume: 6134978186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.917 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53094e8c-bda4-470a-a33a-35ee1fe38c38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6134978186, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.917548', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '727c61e4-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': 'e875a4e12c63103cf4318916abb7d27f7c72840d2639ea43252f0413eed8d2f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.917548', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '727c6b62-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': 'c66d7f323f5328e5b21eab3895eae986e8ecd27f4bc7b79d732a8fea646e9529'}]}, 'timestamp': '2025-11-22 08:08:36.918048', '_unique_id': '52cec1af878744a6951f9b6e9a41dba0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.919 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/memory.usage volume: 42.49609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab3c04d5-2785-4780-843e-96530e068799', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.49609375, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'timestamp': '2025-11-22T08:08:36.919243', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '727ca29e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.611882335, 'message_signature': 'a99c5bfff8e5d5912a1a22a1dd93a37935558015830692c34658390c2682a430'}]}, 'timestamp': '2025-11-22 08:08:36.919466', '_unique_id': 'b8c6f594c69545a2a2f0b35b4c8f7586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.920 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.outgoing.bytes volume: 1480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ff2cf05-031d-4e0f-a4b7-39c6481e4647', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1480, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.920866', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '727ce48e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': 'daf419fedce52bdd6f1b3e6fa65c8c74d7f09a34cce0daf3276de787a2ff9031'}]}, 'timestamp': '2025-11-22 08:08:36.921219', '_unique_id': '4ee6381b66294b1db15525372a4c8bd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.922 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.922 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1178214837>]
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.922 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '265d434a-2851-4d28-bfb1-3dd6e75b94a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.922844', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '727d31dc-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': '33f3dc01628dc808a0793c9b7c74c9cf9feeae68887d3f817f31559e3c443fa1'}]}, 'timestamp': '2025-11-22 08:08:36.923179', '_unique_id': '3d1760612ed44b4595f161bc937956f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.924 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89d7613f-ea06-4898-b304-47c43745e104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.924541', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '727d73c2-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': 'fa841b940204a0da6ddae2d9c7755bbb71336747195c4c2157ad70947940e1e3'}]}, 'timestamp': '2025-11-22 08:08:36.924865', '_unique_id': '8a1b9184f0ec44568acb8e865653335b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.944 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.945 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03c15ca0-73b9-4bcf-88c9-03ce6c7f1b9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.926034', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72809624-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.625549831, 'message_signature': '7d4ace5af460289161413a77f635adc67b72b28885b07570b80bd65f26e805a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.926034', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7280a876-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.625549831, 'message_signature': '951e5f3f8abc1a912647c5e56ea984cdd5e4a26e972ffb6b7defe2482c829601'}]}, 'timestamp': '2025-11-22 08:08:36.945895', '_unique_id': 'e52841ce82304e708e3f2a86eaf0f777'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.948 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7d11bf-3dee-4fc9-ad32-c4109e702d7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.948931', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '72812fa8-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': '7c9b84f850da2061a75785664caaf2cf67457d289da7bd0f87a38e796671bbc9'}]}, 'timestamp': '2025-11-22 08:08:36.949421', '_unique_id': '7b1305f6fadc4d94b63619a688ad25ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.951 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.952 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23fa89e9-81e7-4139-9547-1bdebfb344fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.951843', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7281a0f0-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.625549831, 'message_signature': '58943d97a0ff0c81dd0ac5e8d508b843f6ef3ec7647b9b565fa61f716932a458'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 
'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.951843', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7281af6e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.625549831, 'message_signature': '16faa2715f739c5bc579c9b0e293751ee8fcffcc479a5af8a7d08cfc35e4d915'}]}, 'timestamp': '2025-11-22 08:08:36.952641', '_unique_id': '29317ff787f044929ccbde538ee493f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.955 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.read.latency volume: 1036206891 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.956 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.read.latency volume: 269315802 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '395e7ea6-2326-4737-a759-18db127941a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1036206891, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.955854', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72823fe2-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '13d593dc8d2ade0876c812c8754ceeeb3238f814803a8847f2610c3642c220b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 269315802, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': 
None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.955854', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72824c4e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '81c9a7bba33aae134fdb9456ed1fa5c5d41426d022ec066d7fafe71bdf5f158c'}]}, 'timestamp': '2025-11-22 08:08:36.956620', '_unique_id': '8e6b82877ac54cf49758509c7561a8c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.958 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.959 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a75f04ea-040c-4f82-a495-c5bdce39a73f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.958805', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7282ad88-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.625549831, 'message_signature': 'e8fc0dd70f0d2ab000c62c556bebb68168ba30d2ab1ad37d3ca0f1a9cd7b2790'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.958805', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7282b936-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.625549831, 'message_signature': '2986e779fa869554c83fdc9318dd240328014af37a569d6386b64e192f942cff'}]}, 'timestamp': '2025-11-22 08:08:36.959363', '_unique_id': '075f2500e5d149a7a6fcb5bd18e19faa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.961 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.961 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9afbc74-3493-494e-990f-c76ef785ae39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.961389', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '7283117e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': 'cbbc8985f21a11b32e4957dbc38661f23fb862a2946b4e30c11e646118f25b1f'}]}, 'timestamp': '2025-11-22 08:08:36.961679', '_unique_id': 'c988b797534b4c1ab07758e56a6f0abe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5efc4c46-b712-4347-854a-e5b80300709c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'instance-00000072-e2cd40ca-3604-448b-b55d-f90698a0f28a-tap7dac20df-e4', 'timestamp': '2025-11-22T08:08:36.963064', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'tap7dac20df-e4', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:39:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7dac20df-e4'}, 'message_id': '7283536e-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.577447149, 'message_signature': '336e7be83aa55446be34beafc421eb8413094ec03c5854caa4e090af4883c546'}]}, 'timestamp': '2025-11-22 08:08:36.963324', '_unique_id': 'eb5bc0a60076485f87425429ab680ccb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.964 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.964 12 DEBUG ceilometer.compute.pollsters [-] e2cd40ca-3604-448b-b55d-f90698a0f28a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0327a585-7226-45cd-8bd9-fa37dcf4efb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-vda', 'timestamp': '2025-11-22T08:08:36.964452', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72838960-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '4be8bccd84c9a62dd945ad65f208689a9e9cc0ee11d89e873e88ca13125ad107'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3ec42aac51d84cf985243c562087f0fa', 'user_name': None, 'project_id': '80d80bc50bfd40539762353a02ff7870', 'project_name': None, 'resource_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a-sda', 'timestamp': '2025-11-22T08:08:36.964452', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1178214837', 'name': 'instance-00000072', 'instance_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'instance_type': 'm1.nano', 'host': '49a4b6c57f3f519ff1859873921569a0609d425e84b5821241b18d7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7283918a-c77a-11f0-941d-fa163e6775e5', 'monotonic_time': 5657.547377309, 'message_signature': '61ee0eb072bdf3e8ed947ac83012139dc36e205ac9419ad8e6b181ebb1e63beb'}]}, 'timestamp': '2025-11-22 08:08:36.964918', '_unique_id': '24d578ae4ac442b8bedabc0281c95ce7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:08:36.965 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:08:37 np0005531888 nova_compute[186788]: 2025-11-22 08:08:37.209 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:37 np0005531888 podman[232660]: 2025-11-22 08:08:37.683074473 +0000 UTC m=+0.056582403 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:08:37 np0005531888 podman[232661]: 2025-11-22 08:08:37.697648011 +0000 UTC m=+0.063286127 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:08:38 np0005531888 nova_compute[186788]: 2025-11-22 08:08:38.570 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:41Z|00418|binding|INFO|Releasing lport 05da5975-a719-42f0-8f3a-0ff76d100892 from this chassis (sb_readonly=0)
Nov 22 03:08:41 np0005531888 nova_compute[186788]: 2025-11-22 08:08:41.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:42 np0005531888 nova_compute[186788]: 2025-11-22 08:08:42.211 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:42 np0005531888 nova_compute[186788]: 2025-11-22 08:08:42.881 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:43 np0005531888 nova_compute[186788]: 2025-11-22 08:08:43.571 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:47 np0005531888 nova_compute[186788]: 2025-11-22 08:08:47.214 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:47 np0005531888 podman[232702]: 2025-11-22 08:08:47.68444695 +0000 UTC m=+0.056990283 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 22 03:08:48 np0005531888 nova_compute[186788]: 2025-11-22 08:08:48.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:49 np0005531888 podman[232722]: 2025-11-22 08:08:49.686321425 +0000 UTC m=+0.055118287 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:51.156 104131 DEBUG eventlet.wsgi.server [-] (104131) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:51.159 104131 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: Accept: */*#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: Connection: close#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: Content-Type: text/plain#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: Host: 169.254.169.254#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: User-Agent: curl/7.84.0#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: X-Forwarded-For: 10.100.0.7#015
Nov 22 03:08:51 np0005531888 ovn_metadata_agent[104018]: X-Ovn-Network-Id: e29c4867-0d91-40a0-a0bb-dd5eead4a0be __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 22 03:08:52 np0005531888 nova_compute[186788]: 2025-11-22 08:08:52.215 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:53.515 104131 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:53.516 104131 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 2.3577399#033[00m
Nov 22 03:08:53 np0005531888 haproxy-metadata-proxy-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232514]: 10.100.0.7:42038 [22/Nov/2025:08:08:51.154] listener listener/metadata 0/0/0/2361/2361 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Nov 22 03:08:53 np0005531888 nova_compute[186788]: 2025-11-22 08:08:53.576 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:53.607 104131 DEBUG eventlet.wsgi.server [-] (104131) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:53.608 104131 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: Accept: */*#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: Connection: close#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: Content-Length: 100#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: Content-Type: application/x-www-form-urlencoded#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: Host: 169.254.169.254#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: User-Agent: curl/7.84.0#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: X-Forwarded-For: 10.100.0.7#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: X-Ovn-Network-Id: e29c4867-0d91-40a0-a0bb-dd5eead4a0be#015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: #015
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 22 03:08:53 np0005531888 podman[232746]: 2025-11-22 08:08:53.69354861 +0000 UTC m=+0.064237300 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:53.752 104131 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 22 03:08:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:53.752 104131 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.1436968#033[00m
Nov 22 03:08:53 np0005531888 haproxy-metadata-proxy-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232514]: 10.100.0.7:42046 [22/Nov/2025:08:08:53.606] listener listener/metadata 0/0/0/146/146 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Nov 22 03:08:55 np0005531888 nova_compute[186788]: 2025-11-22 08:08:55.596 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.051 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.052 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.052 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.052 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.052 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.060 186792 INFO nova.compute.manager [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Terminating instance#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.066 186792 DEBUG nova.compute.manager [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:08:56 np0005531888 kernel: tap7dac20df-e4 (unregistering): left promiscuous mode
Nov 22 03:08:56 np0005531888 NetworkManager[55166]: <info>  [1763798936.0909] device (tap7dac20df-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.101 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:56Z|00419|binding|INFO|Releasing lport 7dac20df-e449-4a6a-876e-07468688cf7b from this chassis (sb_readonly=0)
Nov 22 03:08:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:56Z|00420|binding|INFO|Setting lport 7dac20df-e449-4a6a-876e-07468688cf7b down in Southbound
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.103 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:08:56Z|00421|binding|INFO|Removing iface tap7dac20df-e4 ovn-installed in OVS
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.115 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:39:da 10.100.0.7'], port_security=['fa:16:3e:d1:39:da 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e2cd40ca-3604-448b-b55d-f90698a0f28a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80d80bc50bfd40539762353a02ff7870', 'neutron:revision_number': '4', 'neutron:security_group_ids': '13cb9ae7-f274-4b5b-a055-de5b2118e834 e9d21bf8-72d1-4f3d-ad30-87942f562ac1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b93c281-d03a-4cdc-bbb2-39cdc3f7840a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=7dac20df-e449-4a6a-876e-07468688cf7b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.117 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 7dac20df-e449-4a6a-876e-07468688cf7b in datapath e29c4867-0d91-40a0-a0bb-dd5eead4a0be unbound from our chassis#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.119 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.120 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e29c4867-0d91-40a0-a0bb-dd5eead4a0be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.122 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3d606f1e-dd72-4bd5-9fa5-c235e7b52c2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.123 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be namespace which is not needed anymore#033[00m
Nov 22 03:08:56 np0005531888 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000072.scope: Deactivated successfully.
Nov 22 03:08:56 np0005531888 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000072.scope: Consumed 16.552s CPU time.
Nov 22 03:08:56 np0005531888 systemd-machined[153106]: Machine qemu-54-instance-00000072 terminated.
Nov 22 03:08:56 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [NOTICE]   (232512) : haproxy version is 2.8.14-c23fe91
Nov 22 03:08:56 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [NOTICE]   (232512) : path to executable is /usr/sbin/haproxy
Nov 22 03:08:56 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [WARNING]  (232512) : Exiting Master process...
Nov 22 03:08:56 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [ALERT]    (232512) : Current worker (232514) exited with code 143 (Terminated)
Nov 22 03:08:56 np0005531888 neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be[232506]: [WARNING]  (232512) : All workers exited. Exiting... (0)
Nov 22 03:08:56 np0005531888 systemd[1]: libpod-fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f.scope: Deactivated successfully.
Nov 22 03:08:56 np0005531888 podman[232792]: 2025-11-22 08:08:56.263735189 +0000 UTC m=+0.054732067 container died fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:08:56 np0005531888 NetworkManager[55166]: <info>  [1763798936.2973] manager: (tap7dac20df-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f-userdata-shm.mount: Deactivated successfully.
Nov 22 03:08:56 np0005531888 systemd[1]: var-lib-containers-storage-overlay-32ae1e908eaf8fd3b41e22495484bc99bb278e00a4c24093df38621cbe3f2b42-merged.mount: Deactivated successfully.
Nov 22 03:08:56 np0005531888 podman[232792]: 2025-11-22 08:08:56.334437798 +0000 UTC m=+0.125434656 container cleanup fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.342 186792 INFO nova.virt.libvirt.driver [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Instance destroyed successfully.#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.342 186792 DEBUG nova.objects.instance [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lazy-loading 'resources' on Instance uuid e2cd40ca-3604-448b-b55d-f90698a0f28a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:08:56 np0005531888 systemd[1]: libpod-conmon-fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f.scope: Deactivated successfully.
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.356 186792 DEBUG nova.virt.libvirt.vif [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1178214837',display_name='tempest-TestServerBasicOps-server-1178214837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1178214837',id=114,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFDpQsp5k296LFuveuWYcfdhkRkHtXKdiOD6yU4/A12CwH5o3asnU1Q6kA3/dIDgqZ6lPIsmTP7C6Jn7mm6MeODR/3nE0CUvAUSQE/z09SRk1dnZgAMLgLvoPg7AT8LWEw==',key_name='tempest-TestServerBasicOps-712723952',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:08:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80d80bc50bfd40539762353a02ff7870',ramdisk_id='',reservation_id='r-600vfdbm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1001913954',owner_user_name='tempest-TestServerBasicOps-1001913954-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:08:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3ec42aac51d84cf985243c562087f0fa',uuid=e2cd40ca-3604-448b-b55d-f90698a0f28a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": 
"fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.356 186792 DEBUG nova.network.os_vif_util [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Converting VIF {"id": "7dac20df-e449-4a6a-876e-07468688cf7b", "address": "fa:16:3e:d1:39:da", "network": {"id": "e29c4867-0d91-40a0-a0bb-dd5eead4a0be", "bridge": "br-int", "label": "tempest-TestServerBasicOps-83768842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80d80bc50bfd40539762353a02ff7870", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dac20df-e4", "ovs_interfaceid": "7dac20df-e449-4a6a-876e-07468688cf7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.357 186792 DEBUG nova.network.os_vif_util [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.357 186792 DEBUG os_vif [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.359 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.359 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7dac20df-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.360 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.362 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.367 186792 INFO os_vif [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:39:da,bridge_name='br-int',has_traffic_filtering=True,id=7dac20df-e449-4a6a-876e-07468688cf7b,network=Network(e29c4867-0d91-40a0-a0bb-dd5eead4a0be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dac20df-e4')#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.368 186792 INFO nova.virt.libvirt.driver [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Deleting instance files /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a_del#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.368 186792 INFO nova.virt.libvirt.driver [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Deletion of /var/lib/nova/instances/e2cd40ca-3604-448b-b55d-f90698a0f28a_del complete#033[00m
Nov 22 03:08:56 np0005531888 podman[232840]: 2025-11-22 08:08:56.463396289 +0000 UTC m=+0.101064516 container remove fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.469 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[45adae7d-7e1b-47f9-9134-9f1af92827af]: (4, ('Sat Nov 22 08:08:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be (fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f)\nfb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f\nSat Nov 22 08:08:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be (fb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f)\nfb1279cab1b64a9876425bdef293a3eab98f1b207ed2280b992384390ba9973f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.471 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f09d505e-dc24-4f20-9c37-af46960d903d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.472 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape29c4867-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.474 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 kernel: tape29c4867-00: left promiscuous mode
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.486 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.489 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d496bd41-4fe8-40cf-a20d-01f51d4d851c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.503 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3be64f72-0efb-429e-bc71-eb029c4cf652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.504 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[358eee33-e57a-4a9e-8ffd-d67a924f3dd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.519 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[77513d18-1dd9-471a-a372-20b3d5c64e0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563363, 'reachable_time': 25581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232855, 'error': None, 'target': 'ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.522 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e29c4867-0d91-40a0-a0bb-dd5eead4a0be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:08:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:08:56.522 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0893eccb-a895-45ad-94a9-5b1a10c3d3b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:56 np0005531888 systemd[1]: run-netns-ovnmeta\x2de29c4867\x2d0d91\x2d40a0\x2da0bb\x2ddd5eead4a0be.mount: Deactivated successfully.
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.646 186792 DEBUG nova.compute.manager [req-e63107e7-15af-4c73-b0b3-e6e52ba5e0be req-e89214f7-a291-4775-a9b7-21f2b8267f38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-vif-unplugged-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.647 186792 DEBUG oslo_concurrency.lockutils [req-e63107e7-15af-4c73-b0b3-e6e52ba5e0be req-e89214f7-a291-4775-a9b7-21f2b8267f38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.647 186792 DEBUG oslo_concurrency.lockutils [req-e63107e7-15af-4c73-b0b3-e6e52ba5e0be req-e89214f7-a291-4775-a9b7-21f2b8267f38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.648 186792 DEBUG oslo_concurrency.lockutils [req-e63107e7-15af-4c73-b0b3-e6e52ba5e0be req-e89214f7-a291-4775-a9b7-21f2b8267f38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.648 186792 DEBUG nova.compute.manager [req-e63107e7-15af-4c73-b0b3-e6e52ba5e0be req-e89214f7-a291-4775-a9b7-21f2b8267f38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] No waiting events found dispatching network-vif-unplugged-7dac20df-e449-4a6a-876e-07468688cf7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.649 186792 DEBUG nova.compute.manager [req-e63107e7-15af-4c73-b0b3-e6e52ba5e0be req-e89214f7-a291-4775-a9b7-21f2b8267f38 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-vif-unplugged-7dac20df-e449-4a6a-876e-07468688cf7b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:08:56 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:08:56 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.973 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.973 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:08:56 np0005531888 nova_compute[186788]: 2025-11-22 08:08:56.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:57 np0005531888 nova_compute[186788]: 2025-11-22 08:08:57.064 186792 INFO nova.compute.manager [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Took 1.00 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:08:57 np0005531888 nova_compute[186788]: 2025-11-22 08:08:57.064 186792 DEBUG oslo.service.loopingcall [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:08:57 np0005531888 nova_compute[186788]: 2025-11-22 08:08:57.065 186792 DEBUG nova.compute.manager [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:08:57 np0005531888 nova_compute[186788]: 2025-11-22 08:08:57.065 186792 DEBUG nova.network.neutron [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:08:57 np0005531888 nova_compute[186788]: 2025-11-22 08:08:57.968 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.578 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.763 186792 DEBUG nova.compute.manager [req-f8c41ef1-d88b-4c37-a990-a10c2aca8bca req-4b04144c-c9e4-4fcf-ad5e-5311588587ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.764 186792 DEBUG oslo_concurrency.lockutils [req-f8c41ef1-d88b-4c37-a990-a10c2aca8bca req-4b04144c-c9e4-4fcf-ad5e-5311588587ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.764 186792 DEBUG oslo_concurrency.lockutils [req-f8c41ef1-d88b-4c37-a990-a10c2aca8bca req-4b04144c-c9e4-4fcf-ad5e-5311588587ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.764 186792 DEBUG oslo_concurrency.lockutils [req-f8c41ef1-d88b-4c37-a990-a10c2aca8bca req-4b04144c-c9e4-4fcf-ad5e-5311588587ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.765 186792 DEBUG nova.compute.manager [req-f8c41ef1-d88b-4c37-a990-a10c2aca8bca req-4b04144c-c9e4-4fcf-ad5e-5311588587ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] No waiting events found dispatching network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.765 186792 WARNING nova.compute.manager [req-f8c41ef1-d88b-4c37-a990-a10c2aca8bca req-4b04144c-c9e4-4fcf-ad5e-5311588587ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received unexpected event network-vif-plugged-7dac20df-e449-4a6a-876e-07468688cf7b for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:08:58 np0005531888 nova_compute[186788]: 2025-11-22 08:08:58.971 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.388 186792 DEBUG nova.network.neutron [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.433 186792 INFO nova.compute.manager [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Took 2.37 seconds to deallocate network for instance.#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.517 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.518 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.594 186792 DEBUG nova.compute.provider_tree [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.625 186792 DEBUG nova.scheduler.client.report [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.652 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:59 np0005531888 podman[232857]: 2025-11-22 08:08:59.691442454 +0000 UTC m=+0.058903440 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:08:59 np0005531888 podman[232858]: 2025-11-22 08:08:59.714307086 +0000 UTC m=+0.080392408 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:59 np0005531888 nova_compute[186788]: 2025-11-22 08:08:59.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:09:00 np0005531888 nova_compute[186788]: 2025-11-22 08:09:00.015 186792 INFO nova.scheduler.client.report [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Deleted allocations for instance e2cd40ca-3604-448b-b55d-f90698a0f28a#033[00m
Nov 22 03:09:00 np0005531888 nova_compute[186788]: 2025-11-22 08:09:00.305 186792 DEBUG oslo_concurrency.lockutils [None req-59b013e0-18a0-4f75-87de-93380ad45716 3ec42aac51d84cf985243c562087f0fa 80d80bc50bfd40539762353a02ff7870 - - default default] Lock "e2cd40ca-3604-448b-b55d-f90698a0f28a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:00 np0005531888 nova_compute[186788]: 2025-11-22 08:09:00.914 186792 DEBUG nova.compute.manager [req-32e32f72-dd2c-47b8-a6c4-d081b3b191b2 req-383306c7-9329-4f72-ac03-c0db21d8aa91 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Received event network-vif-deleted-7dac20df-e449-4a6a-876e-07468688cf7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:00 np0005531888 nova_compute[186788]: 2025-11-22 08:09:00.978 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:00 np0005531888 nova_compute[186788]: 2025-11-22 08:09:00.978 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:01 np0005531888 nova_compute[186788]: 2025-11-22 08:09:01.362 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:02 np0005531888 nova_compute[186788]: 2025-11-22 08:09:02.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:03 np0005531888 nova_compute[186788]: 2025-11-22 08:09:03.580 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.765 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.766 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.802 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.969 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.970 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.976 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:09:04 np0005531888 nova_compute[186788]: 2025-11-22 08:09:04.977 186792 INFO nova.compute.claims [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.198 186792 DEBUG nova.compute.provider_tree [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.218 186792 DEBUG nova.scheduler.client.report [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.238 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.238 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.448 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.448 186792 DEBUG nova.network.neutron [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.463 186792 INFO nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.479 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.634 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.636 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.636 186792 INFO nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Creating image(s)#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.636 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.636 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.637 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.650 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.719 186792 DEBUG nova.policy [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '867dbb7f34964c339e824aadd897d3f9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '347404e1ff614e68bf6621e027c9212f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.724 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.725 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.726 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.742 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.817 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.818 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.855 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.856 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.856 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.915 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.916 186792 DEBUG nova.virt.disk.api [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Checking if we can resize image /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.916 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.984 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.985 186792 DEBUG nova.virt.disk.api [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Cannot resize image /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:09:05 np0005531888 nova_compute[186788]: 2025-11-22 08:09:05.985 186792 DEBUG nova.objects.instance [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'migration_context' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.001 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.002 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Ensure instance console log exists: /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.002 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.003 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.003 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.365 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:06 np0005531888 nova_compute[186788]: 2025-11-22 08:09:06.987 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.162 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.164 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5710MB free_disk=73.27450942993164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.164 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.164 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.262 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.263 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.263 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.272 186792 DEBUG nova.network.neutron [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Successfully created port: 6e448b80-3c71-41c9-b2b7-9f424c2b854e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.313 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.325 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.377 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:09:07 np0005531888 nova_compute[186788]: 2025-11-22 08:09:07.377 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.036 186792 DEBUG nova.network.neutron [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Successfully updated port: 6e448b80-3c71-41c9-b2b7-9f424c2b854e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.048 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.048 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquired lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.048 186792 DEBUG nova.network.neutron [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.083 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.083 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.109 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.133 186792 DEBUG nova.compute.manager [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-changed-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.134 186792 DEBUG nova.compute.manager [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Refreshing instance network info cache due to event network-changed-6e448b80-3c71-41c9-b2b7-9f424c2b854e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.134 186792 DEBUG oslo_concurrency.lockutils [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.185 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.185 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.191 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.191 186792 INFO nova.compute.claims [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.195 186792 DEBUG nova.network.neutron [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.337 186792 DEBUG nova.compute.provider_tree [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.363 186792 DEBUG nova.scheduler.client.report [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.397 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.398 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.444 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.444 186792 DEBUG nova.network.neutron [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.460 186792 INFO nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.478 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.480 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.564 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.565 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.566 186792 INFO nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Creating image(s)#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.566 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "/var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.567 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "/var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.567 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "/var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.578 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.635 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.636 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.636 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.649 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.711 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.712 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.730 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.753 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.753 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.754 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:08 np0005531888 podman[232916]: 2025-11-22 08:09:08.774519222 +0000 UTC m=+0.052726848 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:09:08 np0005531888 podman[232923]: 2025-11-22 08:09:08.775140077 +0000 UTC m=+0.052381579 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.812 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.812 186792 DEBUG nova.virt.disk.api [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Checking if we can resize image /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.813 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.871 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.872 186792 DEBUG nova.virt.disk.api [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Cannot resize image /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.872 186792 DEBUG nova.objects.instance [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lazy-loading 'migration_context' on Instance uuid 16e73c4a-1de9-49ff-a324-136ed5bd2f13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.884 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.885 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Ensure instance console log exists: /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.885 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.885 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.886 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:08 np0005531888 nova_compute[186788]: 2025-11-22 08:09:08.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.116 186792 DEBUG nova.policy [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bd860070e4b545f1a5e5f171c49032fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd6e68001b48946dd965a602b9f88b202', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.330 186792 DEBUG nova.network.neutron [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Updating instance_info_cache with network_info: [{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.376 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Releasing lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.377 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance network_info: |[{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.377 186792 DEBUG oslo_concurrency.lockutils [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.377 186792 DEBUG nova.network.neutron [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Refreshing network info cache for port 6e448b80-3c71-41c9-b2b7-9f424c2b854e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.381 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Start _get_guest_xml network_info=[{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.385 186792 WARNING nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.392 186792 DEBUG nova.virt.libvirt.host [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.393 186792 DEBUG nova.virt.libvirt.host [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.399 186792 DEBUG nova.virt.libvirt.host [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.400 186792 DEBUG nova.virt.libvirt.host [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.401 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.401 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.402 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.402 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.402 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.402 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.403 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.403 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.403 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.403 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.404 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.404 186792 DEBUG nova.virt.hardware [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.408 186792 DEBUG nova.virt.libvirt.vif [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-648490682',display_name='tempest-ServerRescueTestJSON-server-648490682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-648490682',id=117,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-b1z2x95o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:05Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.408 186792 DEBUG nova.network.os_vif_util [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.409 186792 DEBUG nova.network.os_vif_util [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.409 186792 DEBUG nova.objects.instance [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'pci_devices' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.422 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <uuid>eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462</uuid>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <name>instance-00000075</name>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerRescueTestJSON-server-648490682</nova:name>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:09:09</nova:creationTime>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:user uuid="867dbb7f34964c339e824aadd897d3f9">tempest-ServerRescueTestJSON-1650311982-project-member</nova:user>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:project uuid="347404e1ff614e68bf6621e027c9212f">tempest-ServerRescueTestJSON-1650311982</nova:project>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        <nova:port uuid="6e448b80-3c71-41c9-b2b7-9f424c2b854e">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <entry name="serial">eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462</entry>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <entry name="uuid">eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462</entry>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:34:12:71"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <target dev="tap6e448b80-3c"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/console.log" append="off"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:09:09 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:09:09 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:09:09 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:09:09 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.423 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Preparing to wait for external event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.423 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.423 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.423 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.424 186792 DEBUG nova.virt.libvirt.vif [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-648490682',display_name='tempest-ServerRescueTestJSON-server-648490682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-648490682',id=117,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-b1z2x95o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:05Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.424 186792 DEBUG nova.network.os_vif_util [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.424 186792 DEBUG nova.network.os_vif_util [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.425 186792 DEBUG os_vif [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.425 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.426 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.426 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.428 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.429 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e448b80-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.429 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e448b80-3c, col_values=(('external_ids', {'iface-id': '6e448b80-3c71-41c9-b2b7-9f424c2b854e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:12:71', 'vm-uuid': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.431 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:09 np0005531888 NetworkManager[55166]: <info>  [1763798949.4325] manager: (tap6e448b80-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.438 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.439 186792 INFO os_vif [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c')#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.483 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.483 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.483 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No VIF found with MAC fa:16:3e:34:12:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.484 186792 INFO nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Using config drive#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.814 186792 INFO nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Creating config drive at /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.820 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp29yij6zu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.882 186792 DEBUG nova.network.neutron [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Successfully created port: 99ba6744-630f-4001-b54c-f5ec8ea2647a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.948 186792 DEBUG oslo_concurrency.processutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp29yij6zu" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:09 np0005531888 nova_compute[186788]: 2025-11-22 08:09:09.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:10 np0005531888 kernel: tap6e448b80-3c: entered promiscuous mode
Nov 22 03:09:10 np0005531888 NetworkManager[55166]: <info>  [1763798950.0223] manager: (tap6e448b80-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.022 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:10Z|00422|binding|INFO|Claiming lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e for this chassis.
Nov 22 03:09:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:10Z|00423|binding|INFO|6e448b80-3c71-41c9-b2b7-9f424c2b854e: Claiming fa:16:3e:34:12:71 10.100.0.2
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.026 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:10.036 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:12:71 10.100.0.2'], port_security=['fa:16:3e:34:12:71 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6e448b80-3c71-41c9-b2b7-9f424c2b854e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:10.037 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6e448b80-3c71-41c9-b2b7-9f424c2b854e in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 bound to our chassis#033[00m
Nov 22 03:09:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:10.038 104023 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:09:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:10.039 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6e63eb-213c-420a-8a34-8d6f5014d8d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:10 np0005531888 systemd-udevd[232992]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:10 np0005531888 NetworkManager[55166]: <info>  [1763798950.0739] device (tap6e448b80-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:10 np0005531888 systemd-machined[153106]: New machine qemu-55-instance-00000075.
Nov 22 03:09:10 np0005531888 NetworkManager[55166]: <info>  [1763798950.0775] device (tap6e448b80-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.084 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:10 np0005531888 systemd[1]: Started Virtual Machine qemu-55-instance-00000075.
Nov 22 03:09:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:10Z|00424|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e ovn-installed in OVS
Nov 22 03:09:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:10Z|00425|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e up in Southbound
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.094 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.569 186792 DEBUG nova.compute.manager [req-9a4840b6-5dcb-4b41-85ad-c37102276d04 req-4bd5eb43-03bf-4191-a74a-9050400b2ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.571 186792 DEBUG oslo_concurrency.lockutils [req-9a4840b6-5dcb-4b41-85ad-c37102276d04 req-4bd5eb43-03bf-4191-a74a-9050400b2ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.571 186792 DEBUG oslo_concurrency.lockutils [req-9a4840b6-5dcb-4b41-85ad-c37102276d04 req-4bd5eb43-03bf-4191-a74a-9050400b2ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.571 186792 DEBUG oslo_concurrency.lockutils [req-9a4840b6-5dcb-4b41-85ad-c37102276d04 req-4bd5eb43-03bf-4191-a74a-9050400b2ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.571 186792 DEBUG nova.compute.manager [req-9a4840b6-5dcb-4b41-85ad-c37102276d04 req-4bd5eb43-03bf-4191-a74a-9050400b2ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Processing event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.786 186792 DEBUG nova.network.neutron [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Updated VIF entry in instance network info cache for port 6e448b80-3c71-41c9-b2b7-9f424c2b854e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.787 186792 DEBUG nova.network.neutron [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Updating instance_info_cache with network_info: [{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:10 np0005531888 nova_compute[186788]: 2025-11-22 08:09:10.803 186792 DEBUG oslo_concurrency.lockutils [req-eca84d0c-2c1e-4ede-bf3d-5408d06b555d req-76d1ea7a-7ccd-4f6b-a7fe-b13a7f517752 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.086 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.087 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798951.0876813, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.088 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.091 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.094 186792 INFO nova.virt.libvirt.driver [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance spawned successfully.#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.094 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.114 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.118 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.118 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.119 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.119 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.119 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.120 186792 DEBUG nova.virt.libvirt.driver [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.124 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.138 186792 DEBUG nova.network.neutron [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Successfully updated port: 99ba6744-630f-4001-b54c-f5ec8ea2647a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.158 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "refresh_cache-16e73c4a-1de9-49ff-a324-136ed5bd2f13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.159 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquired lock "refresh_cache-16e73c4a-1de9-49ff-a324-136ed5bd2f13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.159 186792 DEBUG nova.network.neutron [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.168 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.168 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798951.090557, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.169 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.195 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.199 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798951.0910363, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.199 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.216 186792 INFO nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Took 5.58 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.217 186792 DEBUG nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.222 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.227 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.257 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.289 186792 INFO nova.compute.manager [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Took 6.43 seconds to build instance.#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.304 186792 DEBUG oslo_concurrency.lockutils [None req-737bcd7d-178b-450a-91a0-980d5a980ee9 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.340 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798936.3389041, e2cd40ca-3604-448b-b55d-f90698a0f28a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.340 186792 INFO nova.compute.manager [-] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.357 186792 DEBUG nova.compute.manager [None req-634d949b-7386-4f4b-b604-c34b090963c7 - - - - - -] [instance: e2cd40ca-3604-448b-b55d-f90698a0f28a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:11 np0005531888 nova_compute[186788]: 2025-11-22 08:09:11.360 186792 DEBUG nova.network.neutron [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.273 186792 DEBUG nova.network.neutron [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Updating instance_info_cache with network_info: [{"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.290 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Releasing lock "refresh_cache-16e73c4a-1de9-49ff-a324-136ed5bd2f13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.290 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Instance network_info: |[{"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.292 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Start _get_guest_xml network_info=[{"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.299 186792 WARNING nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.312 186792 DEBUG nova.virt.libvirt.host [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.314 186792 DEBUG nova.virt.libvirt.host [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.318 186792 DEBUG nova.virt.libvirt.host [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.319 186792 DEBUG nova.virt.libvirt.host [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.320 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.320 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.321 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.321 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.321 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.322 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.322 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.322 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.322 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.323 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.323 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.323 186792 DEBUG nova.virt.hardware [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.327 186792 DEBUG nova.virt.libvirt.vif [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-616559732',display_name='tempest-InstanceActionsNegativeTestJSON-server-616559732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-616559732',id=118,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d6e68001b48946dd965a602b9f88b202',ramdisk_id='',reservation_id='r-xpzzvu50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-269050617',owner_u
ser_name='tempest-InstanceActionsNegativeTestJSON-269050617-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:08Z,user_data=None,user_id='bd860070e4b545f1a5e5f171c49032fb',uuid=16e73c4a-1de9-49ff-a324-136ed5bd2f13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.329 186792 DEBUG nova.network.os_vif_util [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Converting VIF {"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.330 186792 DEBUG nova.network.os_vif_util [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.332 186792 DEBUG nova.objects.instance [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lazy-loading 'pci_devices' on Instance uuid 16e73c4a-1de9-49ff-a324-136ed5bd2f13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.346 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <uuid>16e73c4a-1de9-49ff-a324-136ed5bd2f13</uuid>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <name>instance-00000076</name>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:name>tempest-InstanceActionsNegativeTestJSON-server-616559732</nova:name>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:09:12</nova:creationTime>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:user uuid="bd860070e4b545f1a5e5f171c49032fb">tempest-InstanceActionsNegativeTestJSON-269050617-project-member</nova:user>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:project uuid="d6e68001b48946dd965a602b9f88b202">tempest-InstanceActionsNegativeTestJSON-269050617</nova:project>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        <nova:port uuid="99ba6744-630f-4001-b54c-f5ec8ea2647a">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <entry name="serial">16e73c4a-1de9-49ff-a324-136ed5bd2f13</entry>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <entry name="uuid">16e73c4a-1de9-49ff-a324-136ed5bd2f13</entry>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.config"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:29:e8:54"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <target dev="tap99ba6744-63"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/console.log" append="off"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:09:12 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:09:12 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:09:12 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:09:12 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.347 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Preparing to wait for external event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.347 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.348 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.348 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.348 186792 DEBUG nova.virt.libvirt.vif [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-616559732',display_name='tempest-InstanceActionsNegativeTestJSON-server-616559732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-616559732',id=118,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d6e68001b48946dd965a602b9f88b202',ramdisk_id='',reservation_id='r-xpzzvu50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-269050617',owner_user_name='tempest-InstanceActionsNegativeTestJSON-269050617-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:08Z,user_data=None,user_id='bd860070e4b545f1a5e5f171c49032fb',uuid=16e73c4a-1de9-49ff-a324-136ed5bd2f13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.349 186792 DEBUG nova.network.os_vif_util [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Converting VIF {"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.349 186792 DEBUG nova.network.os_vif_util [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.350 186792 DEBUG os_vif [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.350 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.351 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.351 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.355 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.355 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99ba6744-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.356 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99ba6744-63, col_values=(('external_ids', {'iface-id': '99ba6744-630f-4001-b54c-f5ec8ea2647a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:e8:54', 'vm-uuid': '16e73c4a-1de9-49ff-a324-136ed5bd2f13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.358 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:12 np0005531888 NetworkManager[55166]: <info>  [1763798952.3589] manager: (tap99ba6744-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.359 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.365 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.366 186792 INFO os_vif [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63')#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.425 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.426 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.426 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] No VIF found with MAC fa:16:3e:29:e8:54, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.426 186792 INFO nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Using config drive#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.640 186792 INFO nova.compute.manager [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Rescuing#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.640 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.641 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquired lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.641 186792 DEBUG nova.network.neutron [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.671 186792 DEBUG nova.compute.manager [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.671 186792 DEBUG oslo_concurrency.lockutils [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.671 186792 DEBUG oslo_concurrency.lockutils [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.672 186792 DEBUG oslo_concurrency.lockutils [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.672 186792 DEBUG nova.compute.manager [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.672 186792 WARNING nova.compute.manager [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.672 186792 DEBUG nova.compute.manager [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received event network-changed-99ba6744-630f-4001-b54c-f5ec8ea2647a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.673 186792 DEBUG nova.compute.manager [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Refreshing instance network info cache due to event network-changed-99ba6744-630f-4001-b54c-f5ec8ea2647a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.673 186792 DEBUG oslo_concurrency.lockutils [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-16e73c4a-1de9-49ff-a324-136ed5bd2f13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.673 186792 DEBUG oslo_concurrency.lockutils [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-16e73c4a-1de9-49ff-a324-136ed5bd2f13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.673 186792 DEBUG nova.network.neutron [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Refreshing network info cache for port 99ba6744-630f-4001-b54c-f5ec8ea2647a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.777 186792 INFO nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Creating config drive at /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.config#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.782 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7xjrrrmo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.912 186792 DEBUG oslo_concurrency.processutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7xjrrrmo" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:12 np0005531888 kernel: tap99ba6744-63: entered promiscuous mode
Nov 22 03:09:12 np0005531888 NetworkManager[55166]: <info>  [1763798952.9818] manager: (tap99ba6744-63): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Nov 22 03:09:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:12Z|00426|binding|INFO|Claiming lport 99ba6744-630f-4001-b54c-f5ec8ea2647a for this chassis.
Nov 22 03:09:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:12Z|00427|binding|INFO|99ba6744-630f-4001-b54c-f5ec8ea2647a: Claiming fa:16:3e:29:e8:54 10.100.0.8
Nov 22 03:09:12 np0005531888 systemd-udevd[232996]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:12 np0005531888 nova_compute[186788]: 2025-11-22 08:09:12.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:12.998 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:e8:54 10.100.0.8'], port_security=['fa:16:3e:29:e8:54 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '16e73c4a-1de9-49ff-a324-136ed5bd2f13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd6e68001b48946dd965a602b9f88b202', 'neutron:revision_number': '2', 'neutron:security_group_ids': '03b8c62b-0a20-44b9-a61a-051dbd3a275f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c1e468c-70b3-4712-b209-9b6ff0037c39, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=99ba6744-630f-4001-b54c-f5ec8ea2647a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:12.999 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 99ba6744-630f-4001-b54c-f5ec8ea2647a in datapath 1cfcea60-813d-46ef-a7d6-3855729ecee4 bound to our chassis#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.000 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1cfcea60-813d-46ef-a7d6-3855729ecee4#033[00m
Nov 22 03:09:13 np0005531888 NetworkManager[55166]: <info>  [1763798953.0043] device (tap99ba6744-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:13 np0005531888 NetworkManager[55166]: <info>  [1763798953.0054] device (tap99ba6744-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.016 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[76a13485-c7e9-4092-95ea-fca3641fcb52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.018 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1cfcea60-81 in ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.020 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1cfcea60-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.020 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3edc9751-354b-4596-9e30-c136cae5d765]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.022 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0903ff-982e-4687-9cc6-6463c6095081]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 systemd-machined[153106]: New machine qemu-56-instance-00000076.
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.043 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0b76bb74-f9fc-4260-b5e1-f069b33068dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:13Z|00428|binding|INFO|Setting lport 99ba6744-630f-4001-b54c-f5ec8ea2647a ovn-installed in OVS
Nov 22 03:09:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:13Z|00429|binding|INFO|Setting lport 99ba6744-630f-4001-b54c-f5ec8ea2647a up in Southbound
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 systemd[1]: Started Virtual Machine qemu-56-instance-00000076.
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.062 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5009c6-f67a-4f72-8d2d-42c856aa7e36]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.097 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[0a727abd-f345-41e5-bf68-286db4511cfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.104 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5b3380b4-ec9f-4ab4-a7b0-aaf5de0455ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 NetworkManager[55166]: <info>  [1763798953.1103] manager: (tap1cfcea60-80): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Nov 22 03:09:13 np0005531888 systemd-udevd[233043]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.145 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a7abac99-b859-4780-9b53-65d19619418a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.151 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[275b1620-4f8f-486d-849b-8fba971124a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 NetworkManager[55166]: <info>  [1763798953.1771] device (tap1cfcea60-80): carrier: link connected
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.181 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[57006960-26f4-4d9f-9114-1746c5f65278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.204 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f6905098-05c6-46c3-80a7-3af00ea0d483]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1cfcea60-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:83:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569382, 'reachable_time': 19003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233063, 'error': None, 'target': 'ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.222 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd3675c-5dff-4101-9346-3c1ae1392ed8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:83f1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569382, 'tstamp': 569382}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233064, 'error': None, 'target': 'ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.239 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[365aaaec-0941-42ed-b176-f51ccd1678ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1cfcea60-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:83:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569382, 'reachable_time': 19003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233065, 'error': None, 'target': 'ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.272 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e1411150-96b3-4edb-8788-13b6f16b9b03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.336 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6160e88a-6488-4bd3-a8ad-f10fa61caa75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.338 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1cfcea60-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.338 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.339 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1cfcea60-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 kernel: tap1cfcea60-80: entered promiscuous mode
Nov 22 03:09:13 np0005531888 NetworkManager[55166]: <info>  [1763798953.3421] manager: (tap1cfcea60-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.345 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1cfcea60-80, col_values=(('external_ids', {'iface-id': '38609dae-73d1-450e-9878-7b817b81ddcd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.346 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:13Z|00430|binding|INFO|Releasing lport 38609dae-73d1-450e-9878-7b817b81ddcd from this chassis (sb_readonly=0)
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.360 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.362 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.363 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1cfcea60-813d-46ef-a7d6-3855729ecee4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1cfcea60-813d-46ef-a7d6-3855729ecee4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.364 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2398f358-6d8f-459b-aa72-472cdca5e83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.365 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-1cfcea60-813d-46ef-a7d6-3855729ecee4
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/1cfcea60-813d-46ef-a7d6-3855729ecee4.pid.haproxy
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 1cfcea60-813d-46ef-a7d6-3855729ecee4
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:13.365 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'env', 'PROCESS_TAG=haproxy-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1cfcea60-813d-46ef-a7d6-3855729ecee4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.451 186792 DEBUG nova.compute.manager [req-9424eca6-a307-4fac-983d-bf009a7d65a8 req-f1841697-393a-4b68-9d3a-5a942071801f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.453 186792 DEBUG oslo_concurrency.lockutils [req-9424eca6-a307-4fac-983d-bf009a7d65a8 req-f1841697-393a-4b68-9d3a-5a942071801f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.453 186792 DEBUG oslo_concurrency.lockutils [req-9424eca6-a307-4fac-983d-bf009a7d65a8 req-f1841697-393a-4b68-9d3a-5a942071801f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.454 186792 DEBUG oslo_concurrency.lockutils [req-9424eca6-a307-4fac-983d-bf009a7d65a8 req-f1841697-393a-4b68-9d3a-5a942071801f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.454 186792 DEBUG nova.compute.manager [req-9424eca6-a307-4fac-983d-bf009a7d65a8 req-f1841697-393a-4b68-9d3a-5a942071801f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Processing event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.710 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531888 podman[233097]: 2025-11-22 08:09:13.814291397 +0000 UTC m=+0.049503168 container create 813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:09:13 np0005531888 systemd[1]: Started libpod-conmon-813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092.scope.
Nov 22 03:09:13 np0005531888 podman[233097]: 2025-11-22 08:09:13.784523725 +0000 UTC m=+0.019735546 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:13 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:09:13 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716ff065788baf6404dc85fdd4faeac314e38acad648fcfc752bf6474a5431a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.905 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798953.9049141, 16e73c4a-1de9-49ff-a324-136ed5bd2f13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.906 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.909 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.919 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.923 186792 INFO nova.virt.libvirt.driver [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Instance spawned successfully.#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.923 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.927 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.929 186792 DEBUG nova.network.neutron [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Updated VIF entry in instance network info cache for port 99ba6744-630f-4001-b54c-f5ec8ea2647a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.929 186792 DEBUG nova.network.neutron [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Updating instance_info_cache with network_info: [{"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:13 np0005531888 podman[233097]: 2025-11-22 08:09:13.930077794 +0000 UTC m=+0.165289585 container init 813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.933 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:13 np0005531888 podman[233097]: 2025-11-22 08:09:13.936345518 +0000 UTC m=+0.171557289 container start 813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.949 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.950 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.951 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.951 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.952 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.953 186792 DEBUG nova.virt.libvirt.driver [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:13 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [NOTICE]   (233123) : New worker (233125) forked
Nov 22 03:09:13 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [NOTICE]   (233123) : Loading success.
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.958 186792 DEBUG oslo_concurrency.lockutils [req-1c0381fc-1b9d-4bfb-990a-879ff42d1f53 req-314cd49b-4148-43bd-bb21-697574738282 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-16e73c4a-1de9-49ff-a324-136ed5bd2f13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.964 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.966 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798953.9053073, 16e73c4a-1de9-49ff-a324-136ed5bd2f13 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.966 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.990 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.994 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798953.913287, 16e73c4a-1de9-49ff-a324-136ed5bd2f13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:13 np0005531888 nova_compute[186788]: 2025-11-22 08:09:13.995 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.022 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.025 186792 INFO nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Took 5.46 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.026 186792 DEBUG nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.029 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.057 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.123 186792 INFO nova.compute.manager [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Took 5.96 seconds to build instance.#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.139 186792 DEBUG oslo_concurrency.lockutils [None req-ef0ec29a-0805-4ec0-95bf-9e4461532cf0 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.163 186792 DEBUG nova.network.neutron [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Updating instance_info_cache with network_info: [{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.184 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Releasing lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:14 np0005531888 nova_compute[186788]: 2025-11-22 08:09:14.767 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.550 186792 DEBUG nova.compute.manager [req-1290f420-b0a8-426b-849c-b7e14ed6eb2b req-44a06a27-da31-4738-946e-ac4a5b9d6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.551 186792 DEBUG oslo_concurrency.lockutils [req-1290f420-b0a8-426b-849c-b7e14ed6eb2b req-44a06a27-da31-4738-946e-ac4a5b9d6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.551 186792 DEBUG oslo_concurrency.lockutils [req-1290f420-b0a8-426b-849c-b7e14ed6eb2b req-44a06a27-da31-4738-946e-ac4a5b9d6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.552 186792 DEBUG oslo_concurrency.lockutils [req-1290f420-b0a8-426b-849c-b7e14ed6eb2b req-44a06a27-da31-4738-946e-ac4a5b9d6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.552 186792 DEBUG nova.compute.manager [req-1290f420-b0a8-426b-849c-b7e14ed6eb2b req-44a06a27-da31-4738-946e-ac4a5b9d6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] No waiting events found dispatching network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.552 186792 WARNING nova.compute.manager [req-1290f420-b0a8-426b-849c-b7e14ed6eb2b req-44a06a27-da31-4738-946e-ac4a5b9d6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received unexpected event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a for instance with vm_state active and task_state None.#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.814 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.816 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.816 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.816 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.817 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.825 186792 INFO nova.compute.manager [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Terminating instance#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.833 186792 DEBUG nova.compute.manager [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:09:15 np0005531888 kernel: tap99ba6744-63 (unregistering): left promiscuous mode
Nov 22 03:09:15 np0005531888 NetworkManager[55166]: <info>  [1763798955.8578] device (tap99ba6744-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.865 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:15Z|00431|binding|INFO|Releasing lport 99ba6744-630f-4001-b54c-f5ec8ea2647a from this chassis (sb_readonly=0)
Nov 22 03:09:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:15Z|00432|binding|INFO|Setting lport 99ba6744-630f-4001-b54c-f5ec8ea2647a down in Southbound
Nov 22 03:09:15 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:15Z|00433|binding|INFO|Removing iface tap99ba6744-63 ovn-installed in OVS
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.873 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:15 np0005531888 nova_compute[186788]: 2025-11-22 08:09:15.889 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:15.891 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:e8:54 10.100.0.8'], port_security=['fa:16:3e:29:e8:54 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '16e73c4a-1de9-49ff-a324-136ed5bd2f13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd6e68001b48946dd965a602b9f88b202', 'neutron:revision_number': '4', 'neutron:security_group_ids': '03b8c62b-0a20-44b9-a61a-051dbd3a275f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c1e468c-70b3-4712-b209-9b6ff0037c39, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=99ba6744-630f-4001-b54c-f5ec8ea2647a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:15.892 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 99ba6744-630f-4001-b54c-f5ec8ea2647a in datapath 1cfcea60-813d-46ef-a7d6-3855729ecee4 unbound from our chassis#033[00m
Nov 22 03:09:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:15.895 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1cfcea60-813d-46ef-a7d6-3855729ecee4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:09:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:15.896 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f5527d65-e7aa-46e6-a3b8-6530f007bcad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:15.897 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4 namespace which is not needed anymore#033[00m
Nov 22 03:09:15 np0005531888 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000076.scope: Deactivated successfully.
Nov 22 03:09:15 np0005531888 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000076.scope: Consumed 2.758s CPU time.
Nov 22 03:09:15 np0005531888 systemd-machined[153106]: Machine qemu-56-instance-00000076 terminated.
Nov 22 03:09:16 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [NOTICE]   (233123) : haproxy version is 2.8.14-c23fe91
Nov 22 03:09:16 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [NOTICE]   (233123) : path to executable is /usr/sbin/haproxy
Nov 22 03:09:16 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [WARNING]  (233123) : Exiting Master process...
Nov 22 03:09:16 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [WARNING]  (233123) : Exiting Master process...
Nov 22 03:09:16 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [ALERT]    (233123) : Current worker (233125) exited with code 143 (Terminated)
Nov 22 03:09:16 np0005531888 neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4[233118]: [WARNING]  (233123) : All workers exited. Exiting... (0)
Nov 22 03:09:16 np0005531888 systemd[1]: libpod-813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092.scope: Deactivated successfully.
Nov 22 03:09:16 np0005531888 podman[233156]: 2025-11-22 08:09:16.033438572 +0000 UTC m=+0.044315931 container died 813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:09:16 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092-userdata-shm.mount: Deactivated successfully.
Nov 22 03:09:16 np0005531888 systemd[1]: var-lib-containers-storage-overlay-2716ff065788baf6404dc85fdd4faeac314e38acad648fcfc752bf6474a5431a-merged.mount: Deactivated successfully.
Nov 22 03:09:16 np0005531888 podman[233156]: 2025-11-22 08:09:16.085134664 +0000 UTC m=+0.096012023 container cleanup 813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:09:16 np0005531888 systemd[1]: libpod-conmon-813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092.scope: Deactivated successfully.
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.104 186792 INFO nova.virt.libvirt.driver [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Instance destroyed successfully.#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.104 186792 DEBUG nova.objects.instance [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lazy-loading 'resources' on Instance uuid 16e73c4a-1de9-49ff-a324-136ed5bd2f13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.116 186792 DEBUG nova.virt.libvirt.vif [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-616559732',display_name='tempest-InstanceActionsNegativeTestJSON-server-616559732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-616559732',id=118,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d6e68001b48946dd965a602b9f88b202',ramdisk_id='',reservation_id='r-xpzzvu50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsNegativeTestJSON-269050617',owner_user_name='tempest-InstanceActionsNegativeTestJSON-269050617-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:14Z,user_data=None,user_id='bd860070e4b545f1a5e5f171c49032fb',uuid=16e73c4a-1de9-49ff-a324-136ed5bd2f13,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.117 186792 DEBUG nova.network.os_vif_util [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Converting VIF {"id": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "address": "fa:16:3e:29:e8:54", "network": {"id": "1cfcea60-813d-46ef-a7d6-3855729ecee4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-237489710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d6e68001b48946dd965a602b9f88b202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99ba6744-63", "ovs_interfaceid": "99ba6744-630f-4001-b54c-f5ec8ea2647a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.118 186792 DEBUG nova.network.os_vif_util [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.119 186792 DEBUG os_vif [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.121 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.121 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99ba6744-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.123 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.124 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.126 186792 INFO os_vif [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:e8:54,bridge_name='br-int',has_traffic_filtering=True,id=99ba6744-630f-4001-b54c-f5ec8ea2647a,network=Network(1cfcea60-813d-46ef-a7d6-3855729ecee4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99ba6744-63')#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.127 186792 INFO nova.virt.libvirt.driver [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Deleting instance files /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13_del#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.128 186792 INFO nova.virt.libvirt.driver [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Deletion of /var/lib/nova/instances/16e73c4a-1de9-49ff-a324-136ed5bd2f13_del complete#033[00m
Nov 22 03:09:16 np0005531888 podman[233198]: 2025-11-22 08:09:16.161187164 +0000 UTC m=+0.056843668 container remove 813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.167 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[25b9c680-a5ae-450b-b6c0-4f322f8ffdfe]: (4, ('Sat Nov 22 08:09:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4 (813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092)\n813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092\nSat Nov 22 08:09:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4 (813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092)\n813da9d4ba1425877d40e9897cf0987cb47286aa771e0e033ba4d00e671e3092\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.169 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7da245-3431-4f2b-8c96-da04b2ddfe7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.170 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1cfcea60-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.172 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 kernel: tap1cfcea60-80: left promiscuous mode
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.174 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.187 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6b0f67b2-41d8-4edd-bc02-b19fc57a3a0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.189 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.196 186792 INFO nova.compute.manager [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.196 186792 DEBUG oslo.service.loopingcall [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.197 186792 DEBUG nova.compute.manager [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.197 186792 DEBUG nova.network.neutron [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.213 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8e134f6d-1052-4d59-be0d-6ebb8af065bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.215 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f481e7c4-85ef-413d-90ea-9430ad62d313]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.231 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[20b732da-d30a-471d-afba-3cae49b8c4d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569373, 'reachable_time': 41581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233210, 'error': None, 'target': 'ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 systemd[1]: run-netns-ovnmeta\x2d1cfcea60\x2d813d\x2d46ef\x2da7d6\x2d3855729ecee4.mount: Deactivated successfully.
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.234 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1cfcea60-813d-46ef-a7d6-3855729ecee4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.234 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[624c39d4-d3c5-44a0-ab84-2b3aa95c947d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:16 np0005531888 nova_compute[186788]: 2025-11-22 08:09:16.354 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.356 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:16.357 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.001 186792 DEBUG nova.network.neutron [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.019 186792 INFO nova.compute.manager [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Took 0.82 seconds to deallocate network for instance.#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.086 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.087 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.169 186792 DEBUG nova.compute.provider_tree [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.180 186792 DEBUG nova.scheduler.client.report [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.198 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.238 186792 INFO nova.scheduler.client.report [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Deleted allocations for instance 16e73c4a-1de9-49ff-a324-136ed5bd2f13#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.293 186792 DEBUG oslo_concurrency.lockutils [None req-c71f482d-d3f6-4153-a661-c587c1e92c66 bd860070e4b545f1a5e5f171c49032fb d6e68001b48946dd965a602b9f88b202 - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:17.359 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.629 186792 DEBUG nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received event network-vif-unplugged-99ba6744-630f-4001-b54c-f5ec8ea2647a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.629 186792 DEBUG oslo_concurrency.lockutils [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.629 186792 DEBUG oslo_concurrency.lockutils [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.629 186792 DEBUG oslo_concurrency.lockutils [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.630 186792 DEBUG nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] No waiting events found dispatching network-vif-unplugged-99ba6744-630f-4001-b54c-f5ec8ea2647a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.630 186792 WARNING nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received unexpected event network-vif-unplugged-99ba6744-630f-4001-b54c-f5ec8ea2647a for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.630 186792 DEBUG nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.630 186792 DEBUG oslo_concurrency.lockutils [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.630 186792 DEBUG oslo_concurrency.lockutils [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.631 186792 DEBUG oslo_concurrency.lockutils [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "16e73c4a-1de9-49ff-a324-136ed5bd2f13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.631 186792 DEBUG nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] No waiting events found dispatching network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.631 186792 WARNING nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received unexpected event network-vif-plugged-99ba6744-630f-4001-b54c-f5ec8ea2647a for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:09:17 np0005531888 nova_compute[186788]: 2025-11-22 08:09:17.631 186792 DEBUG nova.compute.manager [req-ce86b790-d2a2-4817-82b6-d072254ec8b4 req-4250f9ed-f852-47a8-875c-dc8e77fdd7c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Received event network-vif-deleted-99ba6744-630f-4001-b54c-f5ec8ea2647a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:18 np0005531888 podman[233215]: 2025-11-22 08:09:18.697996603 +0000 UTC m=+0.070827223 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 22 03:09:18 np0005531888 nova_compute[186788]: 2025-11-22 08:09:18.713 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:20 np0005531888 podman[233234]: 2025-11-22 08:09:20.680377766 +0000 UTC m=+0.052944114 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:09:21 np0005531888 nova_compute[186788]: 2025-11-22 08:09:21.125 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:21 np0005531888 nova_compute[186788]: 2025-11-22 08:09:21.211 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:23 np0005531888 nova_compute[186788]: 2025-11-22 08:09:23.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:24 np0005531888 podman[233270]: 2025-11-22 08:09:24.675545609 +0000 UTC m=+0.049939028 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:09:24 np0005531888 nova_compute[186788]: 2025-11-22 08:09:24.816 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:09:26 np0005531888 nova_compute[186788]: 2025-11-22 08:09:26.128 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531888 kernel: tap6e448b80-3c (unregistering): left promiscuous mode
Nov 22 03:09:27 np0005531888 NetworkManager[55166]: <info>  [1763798967.3128] device (tap6e448b80-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.320 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:27Z|00434|binding|INFO|Releasing lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e from this chassis (sb_readonly=0)
Nov 22 03:09:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:27Z|00435|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e down in Southbound
Nov 22 03:09:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:27Z|00436|binding|INFO|Removing iface tap6e448b80-3c ovn-installed in OVS
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.325 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:27.333 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:12:71 10.100.0.2'], port_security=['fa:16:3e:34:12:71 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6e448b80-3c71-41c9-b2b7-9f424c2b854e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:27.334 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6e448b80-3c71-41c9-b2b7-9f424c2b854e in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 unbound from our chassis#033[00m
Nov 22 03:09:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:27.336 104023 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:09:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:27.337 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd2d001-ddbc-40ea-9782-c282000699d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531888 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Deactivated successfully.
Nov 22 03:09:27 np0005531888 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Consumed 14.429s CPU time.
Nov 22 03:09:27 np0005531888 systemd-machined[153106]: Machine qemu-55-instance-00000075 terminated.
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.550 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.554 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.794 186792 DEBUG nova.compute.manager [req-de78d688-08ce-4a43-af40-97b9ffc5879d req-95bedb4f-373c-4c78-9c09-04c4e6ba27be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.794 186792 DEBUG oslo_concurrency.lockutils [req-de78d688-08ce-4a43-af40-97b9ffc5879d req-95bedb4f-373c-4c78-9c09-04c4e6ba27be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.794 186792 DEBUG oslo_concurrency.lockutils [req-de78d688-08ce-4a43-af40-97b9ffc5879d req-95bedb4f-373c-4c78-9c09-04c4e6ba27be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.795 186792 DEBUG oslo_concurrency.lockutils [req-de78d688-08ce-4a43-af40-97b9ffc5879d req-95bedb4f-373c-4c78-9c09-04c4e6ba27be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.795 186792 DEBUG nova.compute.manager [req-de78d688-08ce-4a43-af40-97b9ffc5879d req-95bedb4f-373c-4c78-9c09-04c4e6ba27be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.795 186792 WARNING nova.compute.manager [req-de78d688-08ce-4a43-af40-97b9ffc5879d req-95bedb4f-373c-4c78-9c09-04c4e6ba27be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.828 186792 INFO nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.834 186792 INFO nova.virt.libvirt.driver [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance destroyed successfully.#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.834 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'numa_topology' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.859 186792 INFO nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Attempting rescue#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.859 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.864 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.865 186792 INFO nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Creating image(s)#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.866 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.866 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.867 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.868 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.891 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.892 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.903 186792 DEBUG oslo_concurrency.processutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.960 186792 DEBUG oslo_concurrency.processutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:27 np0005531888 nova_compute[186788]: 2025-11-22 08:09:27.961 186792 DEBUG oslo_concurrency.processutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.000 186792 DEBUG oslo_concurrency.processutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.rescue" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.002 186792 DEBUG oslo_concurrency.lockutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.002 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'migration_context' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.015 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.016 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Start _get_guest_xml network_info=[{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2017524329-network", "vif_mac": "fa:16:3e:34:12:71"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.016 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'resources' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.030 186792 WARNING nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.041 186792 DEBUG nova.virt.libvirt.host [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.042 186792 DEBUG nova.virt.libvirt.host [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.046 186792 DEBUG nova.virt.libvirt.host [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.047 186792 DEBUG nova.virt.libvirt.host [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.048 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.049 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.049 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.049 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.050 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.050 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.050 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.050 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.051 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.051 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.051 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.051 186792 DEBUG nova.virt.hardware [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.052 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'vcpu_model' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.066 186792 DEBUG nova.virt.libvirt.vif [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-648490682',display_name='tempest-ServerRescueTestJSON-server-648490682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-648490682',id=117,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-b1z2x95o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:11Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2017524329-network", "vif_mac": "fa:16:3e:34:12:71"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.067 186792 DEBUG nova.network.os_vif_util [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2017524329-network", "vif_mac": "fa:16:3e:34:12:71"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.068 186792 DEBUG nova.network.os_vif_util [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.068 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'pci_devices' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.080 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <uuid>eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462</uuid>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <name>instance-00000075</name>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerRescueTestJSON-server-648490682</nova:name>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:09:28</nova:creationTime>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:user uuid="867dbb7f34964c339e824aadd897d3f9">tempest-ServerRescueTestJSON-1650311982-project-member</nova:user>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:project uuid="347404e1ff614e68bf6621e027c9212f">tempest-ServerRescueTestJSON-1650311982</nova:project>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        <nova:port uuid="6e448b80-3c71-41c9-b2b7-9f424c2b854e">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <entry name="serial">eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462</entry>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <entry name="uuid">eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462</entry>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.rescue"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <target dev="vdb" bus="virtio"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config.rescue"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:34:12:71"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <target dev="tap6e448b80-3c"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/console.log" append="off"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:09:28 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:09:28 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:09:28 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:09:28 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.089 186792 INFO nova.virt.libvirt.driver [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance destroyed successfully.#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.158 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.159 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.159 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.159 186792 DEBUG nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No VIF found with MAC fa:16:3e:34:12:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.160 186792 INFO nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Using config drive#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.171 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'ec2_ids' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.194 186792 DEBUG nova.objects.instance [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'keypairs' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:28 np0005531888 nova_compute[186788]: 2025-11-22 08:09:28.718 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.296 186792 INFO nova.virt.libvirt.driver [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Creating config drive at /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config.rescue#033[00m
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.301 186792 DEBUG oslo_concurrency.processutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkmi1uate execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.427 186792 DEBUG oslo_concurrency.processutils [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkmi1uate" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:29 np0005531888 kernel: tap6e448b80-3c: entered promiscuous mode
Nov 22 03:09:29 np0005531888 systemd-udevd[233296]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:29 np0005531888 NetworkManager[55166]: <info>  [1763798969.5135] manager: (tap6e448b80-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.516 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:29Z|00437|binding|INFO|Claiming lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e for this chassis.
Nov 22 03:09:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:29Z|00438|binding|INFO|6e448b80-3c71-41c9-b2b7-9f424c2b854e: Claiming fa:16:3e:34:12:71 10.100.0.2
Nov 22 03:09:29 np0005531888 NetworkManager[55166]: <info>  [1763798969.5247] device (tap6e448b80-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:29 np0005531888 NetworkManager[55166]: <info>  [1763798969.5256] device (tap6e448b80-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:29.527 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:12:71 10.100.0.2'], port_security=['fa:16:3e:34:12:71 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6e448b80-3c71-41c9-b2b7-9f424c2b854e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:29.528 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6e448b80-3c71-41c9-b2b7-9f424c2b854e in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 bound to our chassis#033[00m
Nov 22 03:09:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:29.529 104023 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:09:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:29.530 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2f425f13-a233-430e-8cf1-9b41cc7d707e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:29Z|00439|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e up in Southbound
Nov 22 03:09:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:29Z|00440|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e ovn-installed in OVS
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.533 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.534 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:29 np0005531888 nova_compute[186788]: 2025-11-22 08:09:29.537 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:29 np0005531888 systemd-machined[153106]: New machine qemu-57-instance-00000075.
Nov 22 03:09:29 np0005531888 systemd[1]: Started Virtual Machine qemu-57-instance-00000075.
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.031 186792 DEBUG nova.compute.manager [req-3570620a-5de5-49c5-b0a7-68d5ed91427f req-64c5f194-a8aa-45f0-952f-7a70bd6eb9b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.032 186792 DEBUG oslo_concurrency.lockutils [req-3570620a-5de5-49c5-b0a7-68d5ed91427f req-64c5f194-a8aa-45f0-952f-7a70bd6eb9b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.033 186792 DEBUG oslo_concurrency.lockutils [req-3570620a-5de5-49c5-b0a7-68d5ed91427f req-64c5f194-a8aa-45f0-952f-7a70bd6eb9b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.033 186792 DEBUG oslo_concurrency.lockutils [req-3570620a-5de5-49c5-b0a7-68d5ed91427f req-64c5f194-a8aa-45f0-952f-7a70bd6eb9b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.034 186792 DEBUG nova.compute.manager [req-3570620a-5de5-49c5-b0a7-68d5ed91427f req-64c5f194-a8aa-45f0-952f-7a70bd6eb9b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.034 186792 WARNING nova.compute.manager [req-3570620a-5de5-49c5-b0a7-68d5ed91427f req-64c5f194-a8aa-45f0-952f-7a70bd6eb9b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.096 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.097 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798970.095928, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.097 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.127 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.129 186792 DEBUG nova.compute.manager [None req-9f9931d0-ad0b-43b4-8fbe-c464b91a617f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.133 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.172 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.172 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798970.098133, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.173 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.217 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:30 np0005531888 nova_compute[186788]: 2025-11-22 08:09:30.222 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:30 np0005531888 podman[233356]: 2025-11-22 08:09:30.72599405 +0000 UTC m=+0.094351722 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:30 np0005531888 podman[233357]: 2025-11-22 08:09:30.732931811 +0000 UTC m=+0.101047057 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.103 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798956.1020627, 16e73c4a-1de9-49ff-a324-136ed5bd2f13 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.103 186792 INFO nova.compute.manager [-] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.124 186792 DEBUG nova.compute.manager [None req-696d847a-9cd6-469b-ae6a-1e2270fc8ae9 - - - - - -] [instance: 16e73c4a-1de9-49ff-a324-136ed5bd2f13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.130 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.494 186792 INFO nova.compute.manager [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Unrescuing#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.495 186792 DEBUG oslo_concurrency.lockutils [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.495 186792 DEBUG oslo_concurrency.lockutils [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquired lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:31 np0005531888 nova_compute[186788]: 2025-11-22 08:09:31.496 186792 DEBUG nova.network.neutron [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.125 186792 DEBUG nova.compute.manager [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.126 186792 DEBUG oslo_concurrency.lockutils [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.126 186792 DEBUG oslo_concurrency.lockutils [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.126 186792 DEBUG oslo_concurrency.lockutils [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.127 186792 DEBUG nova.compute.manager [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.127 186792 WARNING nova.compute.manager [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.127 186792 DEBUG nova.compute.manager [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.127 186792 DEBUG oslo_concurrency.lockutils [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.128 186792 DEBUG oslo_concurrency.lockutils [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.128 186792 DEBUG oslo_concurrency.lockutils [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.128 186792 DEBUG nova.compute.manager [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:32 np0005531888 nova_compute[186788]: 2025-11-22 08:09:32.129 186792 WARNING nova.compute.manager [req-0817b01a-968c-4b0c-81f7-881dbe0bf1c4 req-4fb26f43-293f-468c-ac0d-ac91aee484b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state rescued and task_state unrescuing.
Nov 22 03:09:33 np0005531888 nova_compute[186788]: 2025-11-22 08:09:33.722 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.526 186792 DEBUG nova.network.neutron [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Updating instance_info_cache with network_info: [{"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.543 186792 DEBUG oslo_concurrency.lockutils [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Releasing lock "refresh_cache-eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.544 186792 DEBUG nova.objects.instance [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'flavor' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:09:34 np0005531888 kernel: tap6e448b80-3c (unregistering): left promiscuous mode
Nov 22 03:09:34 np0005531888 NetworkManager[55166]: <info>  [1763798974.5905] device (tap6e448b80-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00441|binding|INFO|Releasing lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e from this chassis (sb_readonly=0)
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00442|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e down in Southbound
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00443|binding|INFO|Removing iface tap6e448b80-3c ovn-installed in OVS
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.599 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.601 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.614 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.629 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:12:71 10.100.0.2'], port_security=['fa:16:3e:34:12:71 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6e448b80-3c71-41c9-b2b7-9f424c2b854e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.631 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6e448b80-3c71-41c9-b2b7-9f424c2b854e in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 unbound from our chassis
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.632 104023 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.634 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f52933cc-9fdd-4444-be73-d3fe75b09f52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:09:34 np0005531888 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000075.scope: Deactivated successfully.
Nov 22 03:09:34 np0005531888 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000075.scope: Consumed 4.918s CPU time.
Nov 22 03:09:34 np0005531888 systemd-machined[153106]: Machine qemu-57-instance-00000075 terminated.
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.795 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.844 186792 INFO nova.virt.libvirt.driver [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance destroyed successfully.
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.845 186792 DEBUG nova.objects.instance [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'numa_topology' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:09:34 np0005531888 kernel: tap6e448b80-3c: entered promiscuous mode
Nov 22 03:09:34 np0005531888 systemd-udevd[233406]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00444|binding|INFO|Claiming lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e for this chassis.
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00445|binding|INFO|6e448b80-3c71-41c9-b2b7-9f424c2b854e: Claiming fa:16:3e:34:12:71 10.100.0.2
Nov 22 03:09:34 np0005531888 NetworkManager[55166]: <info>  [1763798974.9360] manager: (tap6e448b80-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 NetworkManager[55166]: <info>  [1763798974.9444] device (tap6e448b80-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.944 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:12:71 10.100.0.2'], port_security=['fa:16:3e:34:12:71 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6e448b80-3c71-41c9-b2b7-9f424c2b854e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:09:34 np0005531888 NetworkManager[55166]: <info>  [1763798974.9452] device (tap6e448b80-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.945 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6e448b80-3c71-41c9-b2b7-9f424c2b854e in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 bound to our chassis
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.946 104023 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 03:09:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:34.947 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0d711085-019f-4599-85af-ba77fd0e2702]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00446|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e up in Southbound
Nov 22 03:09:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:34Z|00447|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e ovn-installed in OVS
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.948 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.949 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 nova_compute[186788]: 2025-11-22 08:09:34.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:34 np0005531888 systemd-machined[153106]: New machine qemu-58-instance-00000075.
Nov 22 03:09:34 np0005531888 systemd[1]: Started Virtual Machine qemu-58-instance-00000075.
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.316 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.317 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798975.3164346, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.317 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Resumed (Lifecycle Event)
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.321 186792 DEBUG nova.compute.manager [None req-2d5f2986-c764-403d-a609-cc8867b73189 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.356 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.360 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.376 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] During sync_power_state the instance has a pending task (unrescuing). Skip.
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.376 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798975.3167195, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.376 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Started (Lifecycle Event)
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.394 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.398 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.428 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] During sync_power_state the instance has a pending task (unrescuing). Skip.
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.954 186792 DEBUG nova.compute.manager [req-e40089b2-ef6e-4a5f-a15c-a6b5cef26f44 req-76b8e0eb-b312-4781-aa6b-bdaea92b9d26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.954 186792 DEBUG oslo_concurrency.lockutils [req-e40089b2-ef6e-4a5f-a15c-a6b5cef26f44 req-76b8e0eb-b312-4781-aa6b-bdaea92b9d26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.954 186792 DEBUG oslo_concurrency.lockutils [req-e40089b2-ef6e-4a5f-a15c-a6b5cef26f44 req-76b8e0eb-b312-4781-aa6b-bdaea92b9d26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.954 186792 DEBUG oslo_concurrency.lockutils [req-e40089b2-ef6e-4a5f-a15c-a6b5cef26f44 req-76b8e0eb-b312-4781-aa6b-bdaea92b9d26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.955 186792 DEBUG nova.compute.manager [req-e40089b2-ef6e-4a5f-a15c-a6b5cef26f44 req-76b8e0eb-b312-4781-aa6b-bdaea92b9d26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:09:35 np0005531888 nova_compute[186788]: 2025-11-22 08:09:35.955 186792 WARNING nova.compute.manager [req-e40089b2-ef6e-4a5f-a15c-a6b5cef26f44 req-76b8e0eb-b312-4781-aa6b-bdaea92b9d26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state active and task_state None.
Nov 22 03:09:36 np0005531888 nova_compute[186788]: 2025-11-22 08:09:36.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:36.824 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:36.826 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:36.826 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.944 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.944 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.945 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.945 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.946 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.953 186792 INFO nova.compute.manager [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Terminating instance
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.962 186792 DEBUG nova.compute.manager [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 03:09:37 np0005531888 kernel: tap6e448b80-3c (unregistering): left promiscuous mode
Nov 22 03:09:37 np0005531888 NetworkManager[55166]: <info>  [1763798977.9824] device (tap6e448b80-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:09:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:37Z|00448|binding|INFO|Releasing lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e from this chassis (sb_readonly=0)
Nov 22 03:09:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:37Z|00449|binding|INFO|Setting lport 6e448b80-3c71-41c9-b2b7-9f424c2b854e down in Southbound
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.990 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:37Z|00450|binding|INFO|Removing iface tap6e448b80-3c ovn-installed in OVS
Nov 22 03:09:37 np0005531888 nova_compute[186788]: 2025-11-22 08:09:37.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:38.000 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:12:71 10.100.0.2'], port_security=['fa:16:3e:34:12:71 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=6e448b80-3c71-41c9-b2b7-9f424c2b854e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:09:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:38.002 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 6e448b80-3c71-41c9-b2b7-9f424c2b854e in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 unbound from our chassis
Nov 22 03:09:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:38.004 104023 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 03:09:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:38.005 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d513b5d5-0593-4dda-98eb-fe1731172e87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:38 np0005531888 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000075.scope: Deactivated successfully.
Nov 22 03:09:38 np0005531888 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000075.scope: Consumed 3.030s CPU time.
Nov 22 03:09:38 np0005531888 systemd-machined[153106]: Machine qemu-58-instance-00000075 terminated.
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.078 186792 DEBUG nova.compute.manager [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.078 186792 DEBUG oslo_concurrency.lockutils [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.078 186792 DEBUG oslo_concurrency.lockutils [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.078 186792 DEBUG oslo_concurrency.lockutils [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 DEBUG nova.compute.manager [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 WARNING nova.compute.manager [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state active and task_state deleting.
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 DEBUG nova.compute.manager [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 DEBUG oslo_concurrency.lockutils [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 DEBUG oslo_concurrency.lockutils [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 DEBUG oslo_concurrency.lockutils [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.079 186792 DEBUG nova.compute.manager [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.080 186792 WARNING nova.compute.manager [req-74145ae8-9535-4863-ac02-252c6fa89f8a req-90c72840-3f8c-4f82-bd66-3d8fc3569326 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state active and task_state deleting.
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.219 186792 INFO nova.virt.libvirt.driver [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Instance destroyed successfully.
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.219 186792 DEBUG nova.objects.instance [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'resources' on Instance uuid eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.231 186792 DEBUG nova.virt.libvirt.vif [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-648490682',display_name='tempest-ServerRescueTestJSON-server-648490682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-648490682',id=117,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-b1z2x95o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:35Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.232 186792 DEBUG nova.network.os_vif_util [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "address": "fa:16:3e:34:12:71", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e448b80-3c", "ovs_interfaceid": "6e448b80-3c71-41c9-b2b7-9f424c2b854e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.232 186792 DEBUG nova.network.os_vif_util [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.232 186792 DEBUG os_vif [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.234 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.234 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e448b80-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.235 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.237 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.239 186792 INFO os_vif [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:12:71,bridge_name='br-int',has_traffic_filtering=True,id=6e448b80-3c71-41c9-b2b7-9f424c2b854e,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e448b80-3c')#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.239 186792 INFO nova.virt.libvirt.driver [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Deleting instance files /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462_del#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.240 186792 INFO nova.virt.libvirt.driver [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Deletion of /var/lib/nova/instances/eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462_del complete#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.310 186792 INFO nova.compute.manager [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.311 186792 DEBUG oslo.service.loopingcall [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.312 186792 DEBUG nova.compute.manager [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.312 186792 DEBUG nova.network.neutron [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:09:38 np0005531888 nova_compute[186788]: 2025-11-22 08:09:38.722 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.262 186792 DEBUG nova.network.neutron [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.280 186792 INFO nova.compute.manager [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Took 0.97 seconds to deallocate network for instance.#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.409 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.409 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.429 186792 DEBUG nova.compute.manager [req-13de8099-ae82-4811-9d52-ce07a18f0e93 req-b1de7281-a15e-49d7-8eb2-ec402276e32c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-deleted-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.472 186792 DEBUG nova.compute.provider_tree [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.482 186792 DEBUG nova.scheduler.client.report [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.503 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.531 186792 INFO nova.scheduler.client.report [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Deleted allocations for instance eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462#033[00m
Nov 22 03:09:39 np0005531888 nova_compute[186788]: 2025-11-22 08:09:39.606 186792 DEBUG oslo_concurrency.lockutils [None req-e6f36476-9fc6-4659-98f3-ab31b18a70da 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:39 np0005531888 podman[233482]: 2025-11-22 08:09:39.688972358 +0000 UTC m=+0.056841329 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:39 np0005531888 podman[233481]: 2025-11-22 08:09:39.690436073 +0000 UTC m=+0.060420946 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.155 186792 DEBUG nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.155 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.156 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.156 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.156 186792 DEBUG nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.157 186792 WARNING nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.157 186792 DEBUG nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.157 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.157 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.158 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.158 186792 DEBUG nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.158 186792 WARNING nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-unplugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.158 186792 DEBUG nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.158 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.159 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.159 186792 DEBUG oslo_concurrency.lockutils [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.159 186792 DEBUG nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] No waiting events found dispatching network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:40 np0005531888 nova_compute[186788]: 2025-11-22 08:09:40.159 186792 WARNING nova.compute.manager [req-56ff6072-31df-4dfd-9c6b-b2852aa6a251 req-623e6ab4-5867-44fd-8985-2f3ae8265b4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Received unexpected event network-vif-plugged-6e448b80-3c71-41c9-b2b7-9f424c2b854e for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.138 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.139 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.164 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.277 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.277 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.283 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.284 186792 INFO nova.compute.claims [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.473 186792 DEBUG nova.compute.provider_tree [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.488 186792 DEBUG nova.scheduler.client.report [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.509 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.510 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.573 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.574 186792 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.617 186792 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.652 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.764 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.765 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.765 186792 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Creating image(s)#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.766 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.766 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.767 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.779 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.839 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.840 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.840 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.853 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.881 186792 DEBUG nova.policy [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c867ad823e59410b995507d3e85b3465', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c564dfb60114407b72d22a9c49ed513', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.908 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.909 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.946 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
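The two commands above illustrate Nova's qcow2 copy-on-write image backend: the cached base image is probed with `qemu-img info` (wrapped in `oslo_concurrency.prlimit` to cap the helper's address space and CPU time), then a per-instance overlay is created whose `backing_file` points at the shared base. A minimal sketch of how such argument vectors are shaped, with paths and limits taken from the log lines above; this only mirrors the command-line shape, it is not Nova's actual code:

```python
# Sketch: reconstruct the qemu-img invocations seen in the log above.
# BASE/DISK paths are copied from the log; the functions only build argv
# lists, they do not execute anything.

BASE = "/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726"
DISK = "/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk"

def qemu_img_info_cmd(path, mem_limit=1073741824, cpu_limit=30):
    """'qemu-img info' wrapped in oslo's prlimit guard, as in the log."""
    return ["/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
            f"--as={mem_limit}", f"--cpu={cpu_limit}", "--",
            "env", "LC_ALL=C", "LANG=C",
            "qemu-img", "info", path, "--force-share", "--output=json"]

def qemu_img_create_overlay_cmd(base, overlay, size_bytes):
    """Create a qcow2 overlay backed by the shared raw base image."""
    return ["env", "LC_ALL=C", "LANG=C",
            "qemu-img", "create", "-f", "qcow2",
            "-o", f"backing_file={base},backing_fmt=raw",
            overlay, str(size_bytes)]

print(" ".join(qemu_img_create_overlay_cmd(BASE, DISK, 1073741824)))
```

Because the overlay only records deltas against the base, many instances booted from the same image share one read-only base file, which is why the base path is guarded by the `169b8562...` lock in the surrounding lines.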
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.947 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:42 np0005531888 nova_compute[186788]: 2025-11-22 08:09:42.947 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.003 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.004 186792 DEBUG nova.virt.disk.api [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Checking if we can resize image /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.005 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.063 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.065 186792 DEBUG nova.virt.disk.api [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Cannot resize image /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
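The `can_resize_image` lines above show Nova's grow-only resize check: the overlay's virtual size is read back via `qemu-img info`, and the flavor's root-disk size is applied only if it is strictly larger. In this log the requested size (1073741824 bytes) equals the base image's virtual size, so the check fails and nothing is resized. The rule reduces to a comparison like the following (a simplified sketch of the behaviour, not the real `nova.virt.disk.api.can_resize_image`):

```python
def should_resize(current_virtual_size, requested_size):
    """Grow-only rule behind the 'Cannot resize image ... to a smaller
    size' message: a qcow2 virtual disk may be enlarged, never shrunk,
    and an equal size is a no-op.  Sizes are in bytes."""
    return requested_size > current_virtual_size

GIB = 1024 ** 3
# The log's case: flavor disk == base image virtual size, so no resize.
print(should_resize(1 * GIB, 1 * GIB))
```

Shrinking is refused because the guest filesystem inside the image may extend past the truncated region, which would silently corrupt it.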
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.066 186792 DEBUG nova.objects.instance [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'migration_context' on Instance uuid f585652d-c90a-4001-bd55-2ffc90c6bab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.209 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.210 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Ensure instance console log exists: /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.210 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.210 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.211 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.239 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.723 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:43 np0005531888 nova_compute[186788]: 2025-11-22 08:09:43.915 186792 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Successfully created port: c520a39e-661c-4109-845c-df6757d06e77 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.018 186792 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Successfully updated port: c520a39e-661c-4109-845c-df6757d06e77 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.035 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "refresh_cache-f585652d-c90a-4001-bd55-2ffc90c6bab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.036 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquired lock "refresh_cache-f585652d-c90a-4001-bd55-2ffc90c6bab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.036 186792 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.096 186792 DEBUG nova.compute.manager [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-changed-c520a39e-661c-4109-845c-df6757d06e77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.096 186792 DEBUG nova.compute.manager [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Refreshing instance network info cache due to event network-changed-c520a39e-661c-4109-845c-df6757d06e77. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.096 186792 DEBUG oslo_concurrency.lockutils [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-f585652d-c90a-4001-bd55-2ffc90c6bab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.252 186792 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.828 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.829 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.848 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.948 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.948 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.958 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:09:45 np0005531888 nova_compute[186788]: 2025-11-22 08:09:45.959 186792 INFO nova.compute.claims [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.105 186792 DEBUG nova.compute.provider_tree [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.116 186792 DEBUG nova.scheduler.client.report [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.140 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.141 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.227 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.228 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.278 186792 INFO nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.301 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.344 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.347 186792 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Updating instance_info_cache with network_info: [{"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.377 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Releasing lock "refresh_cache-f585652d-c90a-4001-bd55-2ffc90c6bab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.377 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Instance network_info: |[{"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.377 186792 DEBUG oslo_concurrency.lockutils [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-f585652d-c90a-4001-bd55-2ffc90c6bab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.378 186792 DEBUG nova.network.neutron [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Refreshing network info cache for port c520a39e-661c-4109-845c-df6757d06e77 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.381 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Start _get_guest_xml network_info=[{"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.387 186792 WARNING nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.398 186792 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.398 186792 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.402 186792 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.402 186792 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.403 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.403 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.404 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.404 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.404 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.404 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.405 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.405 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.405 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.405 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.405 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.406 186792 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.409 186792 DEBUG nova.virt.libvirt.vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-705204345',display_name='tempest-tempest.common.compute-instance-705204345-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-705204345-1',id=119,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-hgk57t70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:42Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=f585652d-c90a-4001-bd55-2ffc90c6bab2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.410 186792 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.410 186792 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.411 186792 DEBUG nova.objects.instance [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'pci_devices' on Instance uuid f585652d-c90a-4001-bd55-2ffc90c6bab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.436 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <uuid>f585652d-c90a-4001-bd55-2ffc90c6bab2</uuid>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <name>instance-00000077</name>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:name>tempest-tempest.common.compute-instance-705204345-1</nova:name>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:09:46</nova:creationTime>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:user uuid="c867ad823e59410b995507d3e85b3465">tempest-MultipleCreateTestJSON-1558462004-project-member</nova:user>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:project uuid="9c564dfb60114407b72d22a9c49ed513">tempest-MultipleCreateTestJSON-1558462004</nova:project>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        <nova:port uuid="c520a39e-661c-4109-845c-df6757d06e77">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <entry name="serial">f585652d-c90a-4001-bd55-2ffc90c6bab2</entry>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <entry name="uuid">f585652d-c90a-4001-bd55-2ffc90c6bab2</entry>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.config"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:56:a7:48"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <target dev="tapc520a39e-66"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/console.log" append="off"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:09:46 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:09:46 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:09:46 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:09:46 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.438 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Preparing to wait for external event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.438 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.438 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.439 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.439 186792 DEBUG nova.virt.libvirt.vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-705204345',display_name='tempest-tempest.common.compute-instance-705204345-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-705204345-1',id=119,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-hgk57t70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:42Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=f585652d-c90a-4001-bd55-2ffc90c6bab2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.440 186792 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.440 186792 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.440 186792 DEBUG os_vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.441 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.441 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.442 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.444 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.444 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc520a39e-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.444 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc520a39e-66, col_values=(('external_ids', {'iface-id': 'c520a39e-661c-4109-845c-df6757d06e77', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:a7:48', 'vm-uuid': 'f585652d-c90a-4001-bd55-2ffc90c6bab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531888 NetworkManager[55166]: <info>  [1763798986.4471] manager: (tapc520a39e-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.455 186792 INFO os_vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66')#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.459 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.460 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.461 186792 INFO nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Creating image(s)#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.461 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.462 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.462 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.477 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.513 186792 DEBUG nova.policy [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.535 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.536 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.537 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.551 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.578 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.578 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.579 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No VIF found with MAC fa:16:3e:56:a7:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.580 186792 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Using config drive#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.610 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.611 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.765 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk 1073741824" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.766 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.766 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.826 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.828 186792 DEBUG nova.virt.disk.api [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.828 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.886 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.887 186792 DEBUG nova.virt.disk.api [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.888 186792 DEBUG nova.objects.instance [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 8bd0c27f-4042-4314-9eee-7939d2dd2f99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.904 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.905 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Ensure instance console log exists: /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.905 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.906 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:46 np0005531888 nova_compute[186788]: 2025-11-22 08:09:46.906 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.067 186792 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Creating config drive at /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.config#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.071 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfaiy0g4x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.202 186792 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfaiy0g4x" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:47 np0005531888 kernel: tapc520a39e-66: entered promiscuous mode
Nov 22 03:09:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:47Z|00451|binding|INFO|Claiming lport c520a39e-661c-4109-845c-df6757d06e77 for this chassis.
Nov 22 03:09:47 np0005531888 NetworkManager[55166]: <info>  [1763798987.2577] manager: (tapc520a39e-66): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.257 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:47Z|00452|binding|INFO|c520a39e-661c-4109-845c-df6757d06e77: Claiming fa:16:3e:56:a7:48 10.100.0.11
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.264 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.274 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:48 10.100.0.11'], port_security=['fa:16:3e:56:a7:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f585652d-c90a-4001-bd55-2ffc90c6bab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=c520a39e-661c-4109-845c-df6757d06e77) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.275 104023 INFO neutron.agent.ovn.metadata.agent [-] Port c520a39e-661c-4109-845c-df6757d06e77 in datapath c75f33da-8305-4145-97ef-eef656e4f067 bound to our chassis#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.277 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c75f33da-8305-4145-97ef-eef656e4f067#033[00m
Nov 22 03:09:47 np0005531888 systemd-udevd[233575]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:47 np0005531888 systemd-machined[153106]: New machine qemu-59-instance-00000077.
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.288 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ffbeef-e8ad-41ab-8d5c-c1d9404a23d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.289 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc75f33da-81 in ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.290 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc75f33da-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.290 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4b0ba4-6dad-4048-abd7-5d59c1f751bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.291 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[caafa876-9127-46a9-8c95-f933c10bbc43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 NetworkManager[55166]: <info>  [1763798987.2949] device (tapc520a39e-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:47 np0005531888 NetworkManager[55166]: <info>  [1763798987.2956] device (tapc520a39e-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.303 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[a669d4df-1986-4e82-9e06-cd323094458c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 systemd[1]: Started Virtual Machine qemu-59-instance-00000077.
Nov 22 03:09:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:47Z|00453|binding|INFO|Setting lport c520a39e-661c-4109-845c-df6757d06e77 ovn-installed in OVS
Nov 22 03:09:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:47Z|00454|binding|INFO|Setting lport c520a39e-661c-4109-845c-df6757d06e77 up in Southbound
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.321 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.326 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5bdb8669-b409-4404-aa53-f2274da06306]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.351 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c126eb8c-ab1e-48f5-a776-eb0d0b2f4afd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 systemd-udevd[233579]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.356 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[880b7107-c937-4f45-b1ad-3b9d851e160b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 NetworkManager[55166]: <info>  [1763798987.3571] manager: (tapc75f33da-80): new Veth device (/org/freedesktop/NetworkManager/Devices/215)
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.385 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[70b69daf-7d8b-4582-ae6e-3af536e4546b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.388 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[561d76f7-51da-413a-8d57-0cd3804d4d96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 NetworkManager[55166]: <info>  [1763798987.4094] device (tapc75f33da-80): carrier: link connected
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.415 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dd2d37-bbce-4cb6-af84-c08ae56ce592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.431 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d941d430-d3b6-4c39-89f9-b2a9b87a1cfb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572805, 'reachable_time': 26944, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233609, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.446 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[29c143ed-52c6-4156-a744-75308d0b9709]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c898'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572805, 'tstamp': 572805}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233610, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.464 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[141125dd-cbcc-45cc-b6ca-a8d4b9650cf5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572805, 'reachable_time': 26944, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233611, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.498 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4517a5-1d4d-4d8e-917a-b739f5402297]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.522 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Successfully created port: 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.554 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6edcda4-4499-49b0-bbb9-70c6093ca650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.556 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.556 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.556 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc75f33da-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.558 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 NetworkManager[55166]: <info>  [1763798987.5589] manager: (tapc75f33da-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Nov 22 03:09:47 np0005531888 kernel: tapc75f33da-80: entered promiscuous mode
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.562 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc75f33da-80, col_values=(('external_ids', {'iface-id': 'd2b1e9d2-8364-40b7-8c31-edbcc237653b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.563 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:47Z|00455|binding|INFO|Releasing lport d2b1e9d2-8364-40b7-8c31-edbcc237653b from this chassis (sb_readonly=0)
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.564 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.564 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.575 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2a4c478e-be84-4f24-8eed-2df2d77a8e78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.576 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.577 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:47.577 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'env', 'PROCESS_TAG=haproxy-c75f33da-8305-4145-97ef-eef656e4f067', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c75f33da-8305-4145-97ef-eef656e4f067.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.628 186792 DEBUG nova.compute.manager [req-a3b780c9-bf99-4b76-81ae-f7cbdb6c7ca0 req-54222fca-2813-4d41-bb93-7c985e146a56 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.629 186792 DEBUG oslo_concurrency.lockutils [req-a3b780c9-bf99-4b76-81ae-f7cbdb6c7ca0 req-54222fca-2813-4d41-bb93-7c985e146a56 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.629 186792 DEBUG oslo_concurrency.lockutils [req-a3b780c9-bf99-4b76-81ae-f7cbdb6c7ca0 req-54222fca-2813-4d41-bb93-7c985e146a56 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.629 186792 DEBUG oslo_concurrency.lockutils [req-a3b780c9-bf99-4b76-81ae-f7cbdb6c7ca0 req-54222fca-2813-4d41-bb93-7c985e146a56 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.629 186792 DEBUG nova.compute.manager [req-a3b780c9-bf99-4b76-81ae-f7cbdb6c7ca0 req-54222fca-2813-4d41-bb93-7c985e146a56 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Processing event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.779 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.780 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798987.7788322, f585652d-c90a-4001-bd55-2ffc90c6bab2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.781 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.784 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.787 186792 INFO nova.virt.libvirt.driver [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Instance spawned successfully.#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.788 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.825 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.830 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.830 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.831 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.832 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.832 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.833 186792 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.837 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.875 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.875 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798987.7799933, f585652d-c90a-4001-bd55-2ffc90c6bab2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.876 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.920 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.924 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798987.7835264, f585652d-c90a-4001-bd55-2ffc90c6bab2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.925 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.948 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.954 186792 INFO nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Took 5.19 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.954 186792 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.956 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:47 np0005531888 nova_compute[186788]: 2025-11-22 08:09:47.990 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:48 np0005531888 podman[233650]: 2025-11-22 08:09:47.912968443 +0000 UTC m=+0.024039062 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:48 np0005531888 podman[233650]: 2025-11-22 08:09:48.050469345 +0000 UTC m=+0.161539964 container create 412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.056 186792 INFO nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Took 5.82 seconds to build instance.#033[00m
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.059 186792 DEBUG nova.network.neutron [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Updated VIF entry in instance network info cache for port c520a39e-661c-4109-845c-df6757d06e77. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.060 186792 DEBUG nova.network.neutron [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Updating instance_info_cache with network_info: [{"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.072 186792 DEBUG oslo_concurrency.lockutils [req-8e91dc72-b223-4769-84f9-7f805daff708 req-2e2293cf-be53-4380-bf49-2a9f486a381e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-f585652d-c90a-4001-bd55-2ffc90c6bab2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.075 186792 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:48 np0005531888 systemd[1]: Started libpod-conmon-412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b.scope.
Nov 22 03:09:48 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:09:48 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e8b06040f105cc5077e405f0aac4ef9abaf6c11777c1445293bf015d8b24d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:48 np0005531888 podman[233650]: 2025-11-22 08:09:48.232698226 +0000 UTC m=+0.343768855 container init 412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 03:09:48 np0005531888 podman[233650]: 2025-11-22 08:09:48.240749064 +0000 UTC m=+0.351819673 container start 412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:48 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [NOTICE]   (233669) : New worker (233671) forked
Nov 22 03:09:48 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [NOTICE]   (233669) : Loading success.
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.681 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Successfully created port: bf553f99-0dbc-4a65-a64a-54074e8070f1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 03:09:48 np0005531888 nova_compute[186788]: 2025-11-22 08:09:48.725 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:49 np0005531888 podman[233680]: 2025-11-22 08:09:49.7210739 +0000 UTC m=+0.075745684 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.884 186792 DEBUG nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.884 186792 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.884 186792 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.884 186792 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.885 186792 DEBUG nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] No waiting events found dispatching network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.885 186792 WARNING nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received unexpected event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 for instance with vm_state active and task_state None.
Nov 22 03:09:49 np0005531888 nova_compute[186788]: 2025-11-22 08:09:49.918 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Successfully updated port: 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.067 186792 DEBUG nova.compute.manager [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-changed-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.067 186792 DEBUG nova.compute.manager [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing instance network info cache due to event network-changed-8bb24240-cb32-4c05-a4f7-1d73a46b1b71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.068 186792 DEBUG oslo_concurrency.lockutils [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.068 186792 DEBUG oslo_concurrency.lockutils [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.068 186792 DEBUG nova.network.neutron [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing network info cache for port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.476 186792 DEBUG nova.network.neutron [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.699 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.700 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.708 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Successfully updated port: bf553f99-0dbc-4a65-a64a-54074e8070f1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.715 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.737 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.802 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.803 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.812 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.813 186792 INFO nova.compute.claims [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Claim successful on node compute-2.ctlplane.example.com
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.993 186792 DEBUG nova.compute.provider_tree [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:09:50 np0005531888 nova_compute[186788]: 2025-11-22 08:09:50.996 186792 DEBUG nova.network.neutron [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.012 186792 DEBUG nova.scheduler.client.report [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.016 186792 DEBUG oslo_concurrency.lockutils [req-14c70992-c47d-41ca-8043-ebea1d99ae29 req-453a9b7d-3b8f-441c-a8d7-9acbfea0e012 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.018 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.018 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.042 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.043 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.107 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.107 186792 DEBUG nova.network.neutron [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.131 186792 INFO nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.156 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.265 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.269 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.271 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.271 186792 INFO nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Creating image(s)
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.272 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "/var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.272 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "/var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.273 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "/var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.288 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.349 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.351 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.351 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.365 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.430 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.431 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.471 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.472 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.473 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.530 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.532 186792 DEBUG nova.virt.disk.api [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Checking if we can resize image /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.532 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.591 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.593 186792 DEBUG nova.virt.disk.api [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Cannot resize image /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.593 186792 DEBUG nova.objects.instance [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'migration_context' on Instance uuid fd71e2ab-4255-4855-bc4a-28045582ce90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.606 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.607 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Ensure instance console log exists: /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.608 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.608 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:09:51 np0005531888 nova_compute[186788]: 2025-11-22 08:09:51.609 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:51 np0005531888 podman[233715]: 2025-11-22 08:09:51.684036305 +0000 UTC m=+0.052142523 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.109 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.110 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.111 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.111 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.111 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.120 186792 INFO nova.compute.manager [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Terminating instance#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.127 186792 DEBUG nova.compute.manager [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:09:52 np0005531888 kernel: tapc520a39e-66 (unregistering): left promiscuous mode
Nov 22 03:09:52 np0005531888 NetworkManager[55166]: <info>  [1763798992.1498] device (tapc520a39e-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.154 186792 DEBUG nova.compute.manager [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-changed-bf553f99-0dbc-4a65-a64a-54074e8070f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.155 186792 DEBUG nova.compute.manager [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing instance network info cache due to event network-changed-bf553f99-0dbc-4a65-a64a-54074e8070f1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.155 186792 DEBUG oslo_concurrency.lockutils [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:52 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:52Z|00456|binding|INFO|Releasing lport c520a39e-661c-4109-845c-df6757d06e77 from this chassis (sb_readonly=0)
Nov 22 03:09:52 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:52Z|00457|binding|INFO|Setting lport c520a39e-661c-4109-845c-df6757d06e77 down in Southbound
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.161 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:52Z|00458|binding|INFO|Removing iface tapc520a39e-66 ovn-installed in OVS
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.164 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.171 186792 DEBUG nova.network.neutron [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Successfully created port: 42febaff-ab08-422c-921c-4d29969e13f0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.169 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:48 10.100.0.11'], port_security=['fa:16:3e:56:a7:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f585652d-c90a-4001-bd55-2ffc90c6bab2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=c520a39e-661c-4109-845c-df6757d06e77) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.171 104023 INFO neutron.agent.ovn.metadata.agent [-] Port c520a39e-661c-4109-845c-df6757d06e77 in datapath c75f33da-8305-4145-97ef-eef656e4f067 unbound from our chassis#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.173 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c75f33da-8305-4145-97ef-eef656e4f067, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.174 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc5c4d0-f29b-4392-8175-6c1b3c5eddc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.176 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace which is not needed anymore#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.181 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000077.scope: Deactivated successfully.
Nov 22 03:09:52 np0005531888 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000077.scope: Consumed 4.845s CPU time.
Nov 22 03:09:52 np0005531888 systemd-machined[153106]: Machine qemu-59-instance-00000077 terminated.
Nov 22 03:09:52 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [NOTICE]   (233669) : haproxy version is 2.8.14-c23fe91
Nov 22 03:09:52 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [NOTICE]   (233669) : path to executable is /usr/sbin/haproxy
Nov 22 03:09:52 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [WARNING]  (233669) : Exiting Master process...
Nov 22 03:09:52 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [ALERT]    (233669) : Current worker (233671) exited with code 143 (Terminated)
Nov 22 03:09:52 np0005531888 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[233665]: [WARNING]  (233669) : All workers exited. Exiting... (0)
Nov 22 03:09:52 np0005531888 systemd[1]: libpod-412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b.scope: Deactivated successfully.
Nov 22 03:09:52 np0005531888 podman[233764]: 2025-11-22 08:09:52.326745822 +0000 UTC m=+0.052581784 container died 412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b-userdata-shm.mount: Deactivated successfully.
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.360 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay-36e8b06040f105cc5077e405f0aac4ef9abaf6c11777c1445293bf015d8b24d1-merged.mount: Deactivated successfully.
Nov 22 03:09:52 np0005531888 podman[233764]: 2025-11-22 08:09:52.383847776 +0000 UTC m=+0.109683738 container cleanup 412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:09:52 np0005531888 systemd[1]: libpod-conmon-412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b.scope: Deactivated successfully.
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.392 186792 DEBUG nova.compute.manager [req-23b011d5-8c50-4386-acc4-6f0b395a855a req-478fa0f1-0385-4c3c-aada-7258a70c0c94 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-vif-unplugged-c520a39e-661c-4109-845c-df6757d06e77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.393 186792 DEBUG oslo_concurrency.lockutils [req-23b011d5-8c50-4386-acc4-6f0b395a855a req-478fa0f1-0385-4c3c-aada-7258a70c0c94 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.393 186792 DEBUG oslo_concurrency.lockutils [req-23b011d5-8c50-4386-acc4-6f0b395a855a req-478fa0f1-0385-4c3c-aada-7258a70c0c94 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.393 186792 DEBUG oslo_concurrency.lockutils [req-23b011d5-8c50-4386-acc4-6f0b395a855a req-478fa0f1-0385-4c3c-aada-7258a70c0c94 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.393 186792 DEBUG nova.compute.manager [req-23b011d5-8c50-4386-acc4-6f0b395a855a req-478fa0f1-0385-4c3c-aada-7258a70c0c94 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] No waiting events found dispatching network-vif-unplugged-c520a39e-661c-4109-845c-df6757d06e77 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.393 186792 DEBUG nova.compute.manager [req-23b011d5-8c50-4386-acc4-6f0b395a855a req-478fa0f1-0385-4c3c-aada-7258a70c0c94 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-vif-unplugged-c520a39e-661c-4109-845c-df6757d06e77 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.415 186792 INFO nova.virt.libvirt.driver [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Instance destroyed successfully.#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.418 186792 DEBUG nova.objects.instance [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'resources' on Instance uuid f585652d-c90a-4001-bd55-2ffc90c6bab2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.437 186792 DEBUG nova.virt.libvirt.vif [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-705204345',display_name='tempest-tempest.common.compute-instance-705204345-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-705204345-1',id=119,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-hgk57t70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:48Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=f585652d-c90a-4001-bd55-2ffc90c6bab2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.438 186792 DEBUG nova.network.os_vif_util [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "c520a39e-661c-4109-845c-df6757d06e77", "address": "fa:16:3e:56:a7:48", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc520a39e-66", "ovs_interfaceid": "c520a39e-661c-4109-845c-df6757d06e77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.439 186792 DEBUG nova.network.os_vif_util [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.439 186792 DEBUG os_vif [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.442 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc520a39e-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.444 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.445 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.448 186792 INFO os_vif [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:48,bridge_name='br-int',has_traffic_filtering=True,id=c520a39e-661c-4109-845c-df6757d06e77,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc520a39e-66')#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.449 186792 INFO nova.virt.libvirt.driver [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Deleting instance files /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2_del#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.450 186792 INFO nova.virt.libvirt.driver [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Deletion of /var/lib/nova/instances/f585652d-c90a-4001-bd55-2ffc90c6bab2_del complete#033[00m
Nov 22 03:09:52 np0005531888 podman[233806]: 2025-11-22 08:09:52.488156572 +0000 UTC m=+0.069338107 container remove 412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.495 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[92a93e97-e780-4aae-8eda-03f02a13512e]: (4, ('Sat Nov 22 08:09:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b)\n412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b\nSat Nov 22 08:09:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b)\n412ce646d23f9c677735a8988ee50a442d499e7a210c07ac486369810d0a9c1b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.498 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[151e274b-af33-48ef-be74-07c017eeec8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.499 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.502 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 kernel: tapc75f33da-80: left promiscuous mode
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.518 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.521 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cab12577-1fd2-46b7-a15b-b58eeae68332]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.535 186792 INFO nova.compute.manager [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.536 186792 DEBUG oslo.service.loopingcall [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.536 186792 DEBUG nova.compute.manager [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:09:52 np0005531888 nova_compute[186788]: 2025-11-22 08:09:52.537 186792 DEBUG nova.network.neutron [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.538 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[123c17f9-114a-4795-9236-a119c4c04632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.541 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4785dd7-8dea-4502-b58f-8839d1d11e5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.556 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[53d0a3b1-4871-449a-8770-b3f26cd2c660]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572799, 'reachable_time': 31397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233822, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.561 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:09:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:52.562 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bf4227-c168-4682-a070-3bab3d06d2ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:52 np0005531888 systemd[1]: run-netns-ovnmeta\x2dc75f33da\x2d8305\x2d4145\x2d97ef\x2deef656e4f067.mount: Deactivated successfully.
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.144 186792 DEBUG nova.network.neutron [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Successfully updated port: 42febaff-ab08-422c-921c-4d29969e13f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.178 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "refresh_cache-fd71e2ab-4255-4855-bc4a-28045582ce90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.178 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquired lock "refresh_cache-fd71e2ab-4255-4855-bc4a-28045582ce90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.178 186792 DEBUG nova.network.neutron [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.218 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798978.2177262, eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.218 186792 INFO nova.compute.manager [-] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.238 186792 DEBUG nova.compute.manager [None req-b7acf02c-4e3d-40e3-bbce-2c03f331ce3b - - - - - -] [instance: eb2e450c-ba4a-4269-b6cc-d1cb5e1c5462] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.508 186792 DEBUG nova.network.neutron [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.522 186792 DEBUG nova.network.neutron [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.544 186792 INFO nova.compute.manager [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Took 1.01 seconds to deallocate network for instance.#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.610 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.610 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.729 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.733 186792 DEBUG nova.compute.provider_tree [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.745 186792 DEBUG nova.scheduler.client.report [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.764 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.787 186792 INFO nova.scheduler.client.report [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Deleted allocations for instance f585652d-c90a-4001-bd55-2ffc90c6bab2#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.871 186792 DEBUG oslo_concurrency.lockutils [None req-1d6d9e7f-ca5a-4dd6-ae67-887c2386006a c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.890 186792 DEBUG nova.network.neutron [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updating instance_info_cache with network_info: [{"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.921 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.922 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Instance network_info: |[{"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.922 186792 DEBUG oslo_concurrency.lockutils [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.922 186792 DEBUG nova.network.neutron [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing network info cache for port bf553f99-0dbc-4a65-a64a-54074e8070f1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.925 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Start _get_guest_xml network_info=[{"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.930 186792 WARNING nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.935 186792 DEBUG nova.virt.libvirt.host [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.936 186792 DEBUG nova.virt.libvirt.host [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.943 186792 DEBUG nova.virt.libvirt.host [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.944 186792 DEBUG nova.virt.libvirt.host [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.945 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.946 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.946 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.946 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.946 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.947 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.947 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.947 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.947 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.947 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.948 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.948 186792 DEBUG nova.virt.hardware [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.952 186792 DEBUG nova.virt.libvirt.vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1540631613',display_name='tempest-TestGettingAddress-server-1540631613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1540631613',id=122,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-j0pvysn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:46Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=8bd0c27f-4042-4314-9eee-7939d2dd2f99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.952 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.953 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.953 186792 DEBUG nova.virt.libvirt.vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1540631613',display_name='tempest-TestGettingAddress-server-1540631613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1540631613',id=122,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-j0pvysn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:46Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=8bd0c27f-4042-4314-9eee-7939d2dd2f99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.954 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.954 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.955 186792 DEBUG nova.objects.instance [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 8bd0c27f-4042-4314-9eee-7939d2dd2f99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.974 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <uuid>8bd0c27f-4042-4314-9eee-7939d2dd2f99</uuid>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <name>instance-0000007a</name>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestGettingAddress-server-1540631613</nova:name>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:09:53</nova:creationTime>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:port uuid="8bb24240-cb32-4c05-a4f7-1d73a46b1b71">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        <nova:port uuid="bf553f99-0dbc-4a65-a64a-54074e8070f1">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:feb2:4de2" ipVersion="6"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <entry name="serial">8bd0c27f-4042-4314-9eee-7939d2dd2f99</entry>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <entry name="uuid">8bd0c27f-4042-4314-9eee-7939d2dd2f99</entry>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.config"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:73:b6:1e"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <target dev="tap8bb24240-cb"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:b2:4d:e2"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <target dev="tapbf553f99-0d"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/console.log" append="off"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:09:53 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:09:53 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:09:53 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:09:53 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.974 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Preparing to wait for external event network-vif-plugged-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.974 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.975 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.975 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.975 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Preparing to wait for external event network-vif-plugged-bf553f99-0dbc-4a65-a64a-54074e8070f1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.975 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.975 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.975 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.976 186792 DEBUG nova.virt.libvirt.vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1540631613',display_name='tempest-TestGettingAddress-server-1540631613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1540631613',id=122,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-j0pvysn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:46Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=8bd0c27f-4042-4314-9eee-7939d2dd2f99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.976 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.977 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.977 186792 DEBUG os_vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.978 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.979 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.979 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.982 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bb24240-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.982 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8bb24240-cb, col_values=(('external_ids', {'iface-id': '8bb24240-cb32-4c05-a4f7-1d73a46b1b71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:b6:1e', 'vm-uuid': '8bd0c27f-4042-4314-9eee-7939d2dd2f99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 NetworkManager[55166]: <info>  [1763798993.9849] manager: (tap8bb24240-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.986 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.989 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.990 186792 INFO os_vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb')#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.991 186792 DEBUG nova.virt.libvirt.vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1540631613',display_name='tempest-TestGettingAddress-server-1540631613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1540631613',id=122,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-j0pvysn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:46Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=8bd0c27f-4042-4314-9eee-7939d2dd2f99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.991 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.991 186792 DEBUG nova.network.os_vif_util [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.992 186792 DEBUG os_vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.992 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.993 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.995 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf553f99-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.995 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbf553f99-0d, col_values=(('external_ids', {'iface-id': 'bf553f99-0dbc-4a65-a64a-54074e8070f1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:4d:e2', 'vm-uuid': '8bd0c27f-4042-4314-9eee-7939d2dd2f99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.996 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531888 NetworkManager[55166]: <info>  [1763798993.9975] manager: (tapbf553f99-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Nov 22 03:09:53 np0005531888 nova_compute[186788]: 2025-11-22 08:09:53.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.003 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.004 186792 INFO os_vif [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d')#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.243 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.243 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.244 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:73:b6:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.244 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:b2:4d:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.245 186792 INFO nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Using config drive#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.310 186792 DEBUG nova.compute.manager [req-faaf05b5-7333-4643-bc0e-9b2522036cee req-8332803d-c4ab-4434-806a-7536c9c56f99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-vif-deleted-c520a39e-661c-4109-845c-df6757d06e77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.467 186792 DEBUG nova.compute.manager [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-changed-42febaff-ab08-422c-921c-4d29969e13f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.467 186792 DEBUG nova.compute.manager [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Refreshing instance network info cache due to event network-changed-42febaff-ab08-422c-921c-4d29969e13f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.468 186792 DEBUG oslo_concurrency.lockutils [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fd71e2ab-4255-4855-bc4a-28045582ce90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.519 186792 DEBUG nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.519 186792 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.519 186792 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.520 186792 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f585652d-c90a-4001-bd55-2ffc90c6bab2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.520 186792 DEBUG nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] No waiting events found dispatching network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.520 186792 WARNING nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Received unexpected event network-vif-plugged-c520a39e-661c-4109-845c-df6757d06e77 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.734 186792 DEBUG nova.network.neutron [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Updating instance_info_cache with network_info: [{"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.759 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Releasing lock "refresh_cache-fd71e2ab-4255-4855-bc4a-28045582ce90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.759 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Instance network_info: |[{"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.761 186792 DEBUG oslo_concurrency.lockutils [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fd71e2ab-4255-4855-bc4a-28045582ce90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.761 186792 DEBUG nova.network.neutron [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Refreshing network info cache for port 42febaff-ab08-422c-921c-4d29969e13f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.764 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Start _get_guest_xml network_info=[{"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.769 186792 WARNING nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.778 186792 DEBUG nova.virt.libvirt.host [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.779 186792 DEBUG nova.virt.libvirt.host [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.789 186792 DEBUG nova.virt.libvirt.host [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.790 186792 DEBUG nova.virt.libvirt.host [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.791 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.791 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.792 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.792 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.792 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.792 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.793 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.793 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.793 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.794 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.794 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.794 186792 DEBUG nova.virt.hardware [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.798 186792 DEBUG nova.virt.libvirt.vif [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-921620997',display_name='tempest-TestServerMultinode-server-921620997',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-921620997',id=124,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-yx28502t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-1734646453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:51Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=fd71e2ab-4255-4855-bc4a-28045582ce90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.799 186792 DEBUG nova.network.os_vif_util [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.800 186792 DEBUG nova.network.os_vif_util [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.801 186792 DEBUG nova.objects.instance [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'pci_devices' on Instance uuid fd71e2ab-4255-4855-bc4a-28045582ce90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.805 186792 INFO nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Creating config drive at /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.config#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.810 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprspq4qhs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.846 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <uuid>fd71e2ab-4255-4855-bc4a-28045582ce90</uuid>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <name>instance-0000007c</name>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestServerMultinode-server-921620997</nova:name>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:09:54</nova:creationTime>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:user uuid="1bc17d213e01420ebb2a0bf75f44e357">tempest-TestServerMultinode-1734646453-project-admin</nova:user>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:project uuid="b67388009f754931a62cbdd391fb4f53">tempest-TestServerMultinode-1734646453</nova:project>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        <nova:port uuid="42febaff-ab08-422c-921c-4d29969e13f0">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <entry name="serial">fd71e2ab-4255-4855-bc4a-28045582ce90</entry>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <entry name="uuid">fd71e2ab-4255-4855-bc4a-28045582ce90</entry>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.config"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:76:67:04"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <target dev="tap42febaff-ab"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/console.log" append="off"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:09:54 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:09:54 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:09:54 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:09:54 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.846 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Preparing to wait for external event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.847 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.848 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.849 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.850 186792 DEBUG nova.virt.libvirt.vif [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-921620997',display_name='tempest-TestServerMultinode-server-921620997',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-921620997',id=124,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-yx28502t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-1734646453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:51Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=fd71e2ab-4255-4855-bc4a-28045582ce90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.850 186792 DEBUG nova.network.os_vif_util [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.851 186792 DEBUG nova.network.os_vif_util [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.852 186792 DEBUG os_vif [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.858 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.859 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.860 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.864 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42febaff-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.865 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap42febaff-ab, col_values=(('external_ids', {'iface-id': '42febaff-ab08-422c-921c-4d29969e13f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:67:04', 'vm-uuid': 'fd71e2ab-4255-4855-bc4a-28045582ce90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:54 np0005531888 NetworkManager[55166]: <info>  [1763798994.8676] manager: (tap42febaff-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.868 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.877 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.879 186792 INFO os_vif [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab')#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.940 186792 DEBUG oslo_concurrency.processutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprspq4qhs" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.950 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.950 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.950 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No VIF found with MAC fa:16:3e:76:67:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:54 np0005531888 nova_compute[186788]: 2025-11-22 08:09:54.951 186792 INFO nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Using config drive#033[00m
Nov 22 03:09:55 np0005531888 nova_compute[186788]: 2025-11-22 08:09:55.495 186792 INFO nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Creating config drive at /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.config#033[00m
Nov 22 03:09:55 np0005531888 nova_compute[186788]: 2025-11-22 08:09:55.501 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsq8fllrt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:55 np0005531888 nova_compute[186788]: 2025-11-22 08:09:55.589 186792 DEBUG nova.network.neutron [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updated VIF entry in instance network info cache for port bf553f99-0dbc-4a65-a64a-54074e8070f1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:55 np0005531888 nova_compute[186788]: 2025-11-22 08:09:55.590 186792 DEBUG nova.network.neutron [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updating instance_info_cache with network_info: [{"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:55 np0005531888 nova_compute[186788]: 2025-11-22 08:09:55.617 186792 DEBUG oslo_concurrency.lockutils [req-baf4ec3a-41f6-4587-b611-c7aea0c6376e req-05869ac3-d0a1-4974-b7a9-54ab18da1f30 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:55 np0005531888 nova_compute[186788]: 2025-11-22 08:09:55.627 186792 DEBUG oslo_concurrency.processutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsq8fllrt" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:56 np0005531888 podman[233834]: 2025-11-22 08:09:56.080438828 +0000 UTC m=+1.092794316 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 03:09:56 np0005531888 kernel: tap8bb24240-cb: entered promiscuous mode
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.0924] manager: (tap8bb24240-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.0939] manager: (tap42febaff-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Nov 22 03:09:56 np0005531888 kernel: tap42febaff-ab: entered promiscuous mode
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00459|binding|INFO|Claiming lport 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 for this chassis.
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.096 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00460|binding|INFO|8bb24240-cb32-4c05-a4f7-1d73a46b1b71: Claiming fa:16:3e:73:b6:1e 10.100.0.5
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00461|binding|INFO|Claiming lport 42febaff-ab08-422c-921c-4d29969e13f0 for this chassis.
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00462|binding|INFO|42febaff-ab08-422c-921c-4d29969e13f0: Claiming fa:16:3e:76:67:04 10.100.0.10
Nov 22 03:09:56 np0005531888 kernel: tapbf553f99-0d: entered promiscuous mode
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1065] manager: (tapbf553f99-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1191] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00463|if_status|INFO|Not updating pb chassis for bf553f99-0dbc-4a65-a64a-54074e8070f1 now as sb is readonly
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1200] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.121 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 systemd-udevd[233886]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:56 np0005531888 systemd-udevd[233889]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:56 np0005531888 systemd-udevd[233888]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.129 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:67:04 10.100.0.10'], port_security=['fa:16:3e:76:67:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fd71e2ab-4255-4855-bc4a-28045582ce90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b67388009f754931a62cbdd391fb4f53', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e23cfd74-a57b-4610-ab28-51062b779dc9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b005592-2b67-4b5e-87ed-f6d87ca37498, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=42febaff-ab08-422c-921c-4d29969e13f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.131 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:b6:1e 10.100.0.5'], port_security=['fa:16:3e:73:b6:1e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8bd0c27f-4042-4314-9eee-7939d2dd2f99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c8e809e-e81c-4dfc-8977-f974433d5b3a, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=8bb24240-cb32-4c05-a4f7-1d73a46b1b71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.132 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 42febaff-ab08-422c-921c-4d29969e13f0 in datapath 390460fe-fb7f-40ce-abb7-9e99dea93a54 bound to our chassis#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.134 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 390460fe-fb7f-40ce-abb7-9e99dea93a54#033[00m
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1384] device (tapbf553f99-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1398] device (tap42febaff-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1407] device (tapbf553f99-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1417] device (tap42febaff-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1464] device (tap8bb24240-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.1478] device (tap8bb24240-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.149 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fa86ed10-e343-497a-a3fa-ab6a8f44261c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.150 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap390460fe-f1 in ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.155 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap390460fe-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.155 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d1692c95-2ad2-4241-b196-5e2bc6763978]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.156 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3724f5b1-814f-4c47-a9d1-55f7b58266a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 systemd-machined[153106]: New machine qemu-61-instance-0000007c.
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.168 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8e5a97-65de-4e86-8d28-bfa8c6d9afe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 systemd[1]: Started Virtual Machine qemu-61-instance-0000007c.
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.199 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e8744e87-25d5-438c-ade7-0796842854c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 systemd[1]: Started Virtual Machine qemu-60-instance-0000007a.
Nov 22 03:09:56 np0005531888 systemd-machined[153106]: New machine qemu-60-instance-0000007a.
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.230 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[977a8be4-8189-4fec-aea2-76561c951bc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.251 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d397dbfe-1c25-4026-83b7-b05d0e88c909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.2548] manager: (tap390460fe-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.298 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8939aa-bf43-491f-b540-695a4c3da5cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.301 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c228c4-c2d1-481c-a3c3-85f4befc9079]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00464|binding|INFO|Claiming lport bf553f99-0dbc-4a65-a64a-54074e8070f1 for this chassis.
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.3260] device (tap390460fe-f0): carrier: link connected
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00465|binding|INFO|bf553f99-0dbc-4a65-a64a-54074e8070f1: Claiming fa:16:3e:b2:4d:e2 2001:db8::f816:3eff:feb2:4de2
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00466|binding|INFO|Setting lport 42febaff-ab08-422c-921c-4d29969e13f0 ovn-installed in OVS
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00467|binding|INFO|Setting lport 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 ovn-installed in OVS
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.347 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8c2b62eb-5ca5-4fd6-b33a-5edceaed3d47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00468|binding|INFO|Setting lport bf553f99-0dbc-4a65-a64a-54074e8070f1 ovn-installed in OVS
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.364 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.367 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[14a7e8eb-79e0-4a2c-a611-9b8e5b03d8fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap390460fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:0a:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573696, 'reachable_time': 24637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233932, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.383 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[54ff9b57-7f5c-4051-ad36-6abf2a902732]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:a50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573696, 'tstamp': 573696}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233933, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.401 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eaab46e1-f80b-43d9-9183-07ce741a0dbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap390460fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:0a:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573696, 'reachable_time': 24637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233936, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.429 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea38e37-e90e-483e-84fa-525db4d0a348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00469|binding|INFO|Setting lport 42febaff-ab08-422c-921c-4d29969e13f0 up in Southbound
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00470|binding|INFO|Setting lport 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 up in Southbound
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00471|binding|INFO|Setting lport bf553f99-0dbc-4a65-a64a-54074e8070f1 up in Southbound
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.449 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:4d:e2 2001:db8::f816:3eff:feb2:4de2'], port_security=['fa:16:3e:b2:4d:e2 2001:db8::f816:3eff:feb2:4de2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb2:4de2/64', 'neutron:device_id': '8bd0c27f-4042-4314-9eee-7939d2dd2f99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a4afbec-9e59-4ffa-9128-10dc4f025189, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=bf553f99-0dbc-4a65-a64a-54074e8070f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.482 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798996.4821503, fd71e2ab-4255-4855-bc4a-28045582ce90 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.482 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.491 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2dbd8513-8d11-471b-a0cb-c06c1c23d60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.493 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap390460fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.493 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.494 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap390460fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.495 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 NetworkManager[55166]: <info>  [1763798996.4961] manager: (tap390460fe-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Nov 22 03:09:56 np0005531888 kernel: tap390460fe-f0: entered promiscuous mode
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.498 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.499 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap390460fe-f0, col_values=(('external_ids', {'iface-id': '71a8d1b1-af34-4bcb-98ae-9fcab10d0f3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.500 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.502 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:56Z|00472|binding|INFO|Releasing lport 71a8d1b1-af34-4bcb-98ae-9fcab10d0f3b from this chassis (sb_readonly=0)
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.502 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.503 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[28224599-56e6-4c29-b0c0-5643074f8da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.505 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:56.507 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'env', 'PROCESS_TAG=haproxy-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/390460fe-fb7f-40ce-abb7-9e99dea93a54.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.512 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.513 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.517 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798996.484448, fd71e2ab-4255-4855-bc4a-28045582ce90 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.517 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.534 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.538 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.558 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.560 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798996.559974, 8bd0c27f-4042-4314-9eee-7939d2dd2f99 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.560 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.580 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.583 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798996.5601184, 8bd0c27f-4042-4314-9eee-7939d2dd2f99 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.584 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.600 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.602 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.624 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.730 186792 DEBUG nova.compute.manager [req-5ff28269-2d09-4a81-8db7-1ce108a53d8e req-d9ec36c0-f1ce-418a-905a-c7b0a7146f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-vif-plugged-bf553f99-0dbc-4a65-a64a-54074e8070f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.730 186792 DEBUG oslo_concurrency.lockutils [req-5ff28269-2d09-4a81-8db7-1ce108a53d8e req-d9ec36c0-f1ce-418a-905a-c7b0a7146f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.731 186792 DEBUG oslo_concurrency.lockutils [req-5ff28269-2d09-4a81-8db7-1ce108a53d8e req-d9ec36c0-f1ce-418a-905a-c7b0a7146f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.731 186792 DEBUG oslo_concurrency.lockutils [req-5ff28269-2d09-4a81-8db7-1ce108a53d8e req-d9ec36c0-f1ce-418a-905a-c7b0a7146f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.731 186792 DEBUG nova.compute.manager [req-5ff28269-2d09-4a81-8db7-1ce108a53d8e req-d9ec36c0-f1ce-418a-905a-c7b0a7146f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Processing event network-vif-plugged-bf553f99-0dbc-4a65-a64a-54074e8070f1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:56 np0005531888 podman[233982]: 2025-11-22 08:09:56.871593485 +0000 UTC m=+0.049616671 container create 05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:09:56 np0005531888 systemd[1]: Started libpod-conmon-05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79.scope.
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.921 186792 DEBUG nova.network.neutron [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Updated VIF entry in instance network info cache for port 42febaff-ab08-422c-921c-4d29969e13f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.923 186792 DEBUG nova.network.neutron [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Updating instance_info_cache with network_info: [{"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:56 np0005531888 nova_compute[186788]: 2025-11-22 08:09:56.938 186792 DEBUG oslo_concurrency.lockutils [req-ffde8196-29cc-4bdc-9f82-c1e4183e0ed5 req-e784f914-16fb-4321-a0cb-119db935c1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fd71e2ab-4255-4855-bc4a-28045582ce90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:56 np0005531888 podman[233982]: 2025-11-22 08:09:56.844407336 +0000 UTC m=+0.022430552 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:56 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:09:56 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be26ca35661d824834bba69c3f19c1412d844aa93fde8a9b208a2190feaeb5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:56 np0005531888 podman[233982]: 2025-11-22 08:09:56.961102176 +0000 UTC m=+0.139125382 container init 05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:09:56 np0005531888 podman[233982]: 2025-11-22 08:09:56.966299704 +0000 UTC m=+0.144322890 container start 05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 03:09:56 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [NOTICE]   (234001) : New worker (234003) forked
Nov 22 03:09:56 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [NOTICE]   (234001) : Loading success.
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.028 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 in datapath 8591a8a4-c35f-454b-ba4c-4ec37a8765b2 unbound from our chassis#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.030 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8591a8a4-c35f-454b-ba4c-4ec37a8765b2#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.042 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[948b3bbe-f59c-4fe5-8f6b-079ee9ac1892]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.043 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8591a8a4-c1 in ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.045 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8591a8a4-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.045 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[58b7b9d7-3c4d-4fbf-ab56-875a926cea41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.046 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0e414c-d4a0-4691-b735-d6d02f3bf2e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.060 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[b4cbd50a-4633-43c4-bf3e-6a247f991145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.073 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b9fa092c-0c34-4ee5-8ff9-513778ab7adb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.098 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4534dcc1-32e0-4810-b4d5-8df231a9a5a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.103 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[77b6ab03-c11a-412a-bf19-aec335afa6bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 NetworkManager[55166]: <info>  [1763798997.1049] manager: (tap8591a8a4-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/227)
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.133 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd9c0c6-0567-4aec-b5e5-d9418629f5e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.137 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5adff7b5-21e0-44eb-92d8-deda79bfc95a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 NetworkManager[55166]: <info>  [1763798997.1580] device (tap8591a8a4-c0): carrier: link connected
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.164 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3566f0e5-ebb5-48e5-ae73-27334c839da6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.180 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7843537e-ec4a-4b03-a953-5d258bb4454b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8591a8a4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:5e:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573780, 'reachable_time': 40961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234022, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.196 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7e262133-db6a-467d-be61-ca19026c408c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:5ece'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573780, 'tstamp': 573780}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234023, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.212 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb6abc3-f68b-4734-95a6-6762948a4a07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8591a8a4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:5e:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573780, 'reachable_time': 40961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234024, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.238 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6f250c-6501-4d0e-8af1-3b72e243cab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.292 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9e9653-3c0b-44b9-81e3-42db4af5aac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.294 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8591a8a4-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.295 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.295 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8591a8a4-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.297 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:57 np0005531888 kernel: tap8591a8a4-c0: entered promiscuous mode
Nov 22 03:09:57 np0005531888 NetworkManager[55166]: <info>  [1763798997.2978] manager: (tap8591a8a4-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.300 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8591a8a4-c0, col_values=(('external_ids', {'iface-id': 'ec231e2a-1042-4a3a-b541-060f5a121bb8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:57Z|00473|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.314 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.315 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.316 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[919e4676-b48e-422e-84b8-f6557a82c4e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.317 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-8591a8a4-c35f-454b-ba4c-4ec37a8765b2
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.pid.haproxy
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 8591a8a4-c35f-454b-ba4c-4ec37a8765b2
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.318 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'env', 'PROCESS_TAG=haproxy-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:57 np0005531888 podman[234053]: 2025-11-22 08:09:57.713831728 +0000 UTC m=+0.056093330 container create 8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:09:57 np0005531888 systemd[1]: Started libpod-conmon-8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f.scope.
Nov 22 03:09:57 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:09:57 np0005531888 podman[234053]: 2025-11-22 08:09:57.680377206 +0000 UTC m=+0.022638838 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:57 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b46d5f55b8bef239d3a6a44edabb9a9f94ec06b28b0fcc292bce410b47230fbe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:57 np0005531888 podman[234053]: 2025-11-22 08:09:57.790545695 +0000 UTC m=+0.132807327 container init 8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:09:57 np0005531888 podman[234053]: 2025-11-22 08:09:57.797073325 +0000 UTC m=+0.139334927 container start 8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:57 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [NOTICE]   (234072) : New worker (234074) forked
Nov 22 03:09:57 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [NOTICE]   (234072) : Loading success.
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.873 104023 INFO neutron.agent.ovn.metadata.agent [-] Port bf553f99-0dbc-4a65-a64a-54074e8070f1 in datapath 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad unbound from our chassis#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.876 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.888 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2b8bb5eb-af46-49d0-9824-c6b66d8cbbc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.889 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a8e7fc1-61 in ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.892 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a8e7fc1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.892 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[da493d53-9090-462b-bfd9-6f8dabef52e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.893 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dc26aac3-46f5-4c1a-8956-3a1b50ff24df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.903 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[b094c539-81b0-4874-a377-03b2f5ff9904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.916 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7d30e770-fe6b-4026-b257-1f1a5148ee40]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.943 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef5b963-de32-4951-80e8-f738b8fe31ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:57 np0005531888 NetworkManager[55166]: <info>  [1763798997.9513] manager: (tap6a8e7fc1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.951 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2de08e12-e9f7-4025-83c5-f1baccae2565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 nova_compute[186788]: 2025-11-22 08:09:57.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.983 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d138bad1-74e2-4067-83e6-7a9eb7d2ef7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:57.986 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ac142bd6-90fb-4a6c-aeda-2e37ebf6fc51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 NetworkManager[55166]: <info>  [1763798998.0090] device (tap6a8e7fc1-60): carrier: link connected
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.014 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cef9b7-e320-4502-b30c-bf90fbcc30be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.032 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b33b04f5-a2a5-4e3f-b3a5-6f7bd50c6a62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8e7fc1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:e2:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573865, 'reachable_time': 36662, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234096, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.048 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d1adbba6-dc4f-486d-bb31-f6db9f327793]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:e22a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573865, 'tstamp': 573865}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234097, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.068 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3e10955c-9b21-4d2d-8e92-1e71f17c3ccf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8e7fc1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:e2:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 149], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573865, 'reachable_time': 36662, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234098, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.097 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd6fd9b-a78a-437a-b258-71071e2221a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.125 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6b745842-565e-4d11-8a5d-ff63fbc708d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.127 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8e7fc1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.127 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.128 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a8e7fc1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.129 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531888 NetworkManager[55166]: <info>  [1763798998.1304] manager: (tap6a8e7fc1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Nov 22 03:09:58 np0005531888 kernel: tap6a8e7fc1-60: entered promiscuous mode
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.135 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a8e7fc1-60, col_values=(('external_ids', {'iface-id': '288f6565-c1a7-412f-8593-8864123e2215'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.136 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:09:58Z|00474|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.137 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.140 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.141 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[73b4b714-0eb5-41f9-ab26-be23ecc0881f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.141 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.pid.haproxy
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:09:58.142 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'env', 'PROCESS_TAG=haproxy-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.149 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531888 podman[234128]: 2025-11-22 08:09:58.486775988 +0000 UTC m=+0.049163910 container create 62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 03:09:58 np0005531888 systemd[1]: Started libpod-conmon-62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112.scope.
Nov 22 03:09:58 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:09:58 np0005531888 podman[234128]: 2025-11-22 08:09:58.462368327 +0000 UTC m=+0.024756279 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:58 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3389f4b4c4d86fe38df56e8e7d90229f0b62948d1ce55388bb30812be1bbd40c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:58 np0005531888 podman[234128]: 2025-11-22 08:09:58.576197027 +0000 UTC m=+0.138584979 container init 62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:58 np0005531888 podman[234128]: 2025-11-22 08:09:58.581765554 +0000 UTC m=+0.144153486 container start 62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:09:58 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [NOTICE]   (234147) : New worker (234149) forked
Nov 22 03:09:58 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [NOTICE]   (234147) : Loading success.
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.732 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.841 186792 DEBUG nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-vif-plugged-bf553f99-0dbc-4a65-a64a-54074e8070f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.841 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.842 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.842 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.842 186792 DEBUG nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] No event matching network-vif-plugged-bf553f99-0dbc-4a65-a64a-54074e8070f1 in dict_keys([('network-vif-plugged', '8bb24240-cb32-4c05-a4f7-1d73a46b1b71')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.842 186792 WARNING nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received unexpected event network-vif-plugged-bf553f99-0dbc-4a65-a64a-54074e8070f1 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.842 186792 DEBUG nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.843 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.843 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.843 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.843 186792 DEBUG nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Processing event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.843 186792 DEBUG nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.844 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.844 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.844 186792 DEBUG oslo_concurrency.lockutils [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.844 186792 DEBUG nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] No waiting events found dispatching network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.844 186792 WARNING nova.compute.manager [req-8577a790-b2f9-4f7d-b175-cefc8a5c17e1 req-611aeebe-f92b-403f-bef6-d3f1faf7e43b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received unexpected event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.845 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.852 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763798998.851878, fd71e2ab-4255-4855-bc4a-28045582ce90 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.852 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.855 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.860 186792 INFO nova.virt.libvirt.driver [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Instance spawned successfully.#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.861 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.889 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.895 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.896 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.896 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.897 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.897 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.898 186792 DEBUG nova.virt.libvirt.driver [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.904 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.946 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:09:58 np0005531888 nova_compute[186788]: 2025-11-22 08:09:58.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:09:59 np0005531888 nova_compute[186788]: 2025-11-22 08:09:59.036 186792 INFO nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Took 7.77 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:09:59 np0005531888 nova_compute[186788]: 2025-11-22 08:09:59.036 186792 DEBUG nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:59 np0005531888 nova_compute[186788]: 2025-11-22 08:09:59.116 186792 INFO nova.compute.manager [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Took 8.35 seconds to build instance.#033[00m
Nov 22 03:09:59 np0005531888 nova_compute[186788]: 2025-11-22 08:09:59.147 186792 DEBUG oslo_concurrency.lockutils [None req-679cc533-8959-42fd-8ce9-0d646cdc097b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:59 np0005531888 nova_compute[186788]: 2025-11-22 08:09:59.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:00 np0005531888 nova_compute[186788]: 2025-11-22 08:10:00.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:00 np0005531888 nova_compute[186788]: 2025-11-22 08:10:00.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.234 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.234 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.235 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.235 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.235 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.243 186792 INFO nova.compute.manager [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Terminating instance#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.249 186792 DEBUG nova.compute.manager [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:10:01 np0005531888 kernel: tap42febaff-ab (unregistering): left promiscuous mode
Nov 22 03:10:01 np0005531888 NetworkManager[55166]: <info>  [1763799001.2796] device (tap42febaff-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.285 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:01Z|00475|binding|INFO|Releasing lport 42febaff-ab08-422c-921c-4d29969e13f0 from this chassis (sb_readonly=0)
Nov 22 03:10:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:01Z|00476|binding|INFO|Setting lport 42febaff-ab08-422c-921c-4d29969e13f0 down in Southbound
Nov 22 03:10:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:01Z|00477|binding|INFO|Removing iface tap42febaff-ab ovn-installed in OVS
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.288 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.301 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:67:04 10.100.0.10'], port_security=['fa:16:3e:76:67:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fd71e2ab-4255-4855-bc4a-28045582ce90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b67388009f754931a62cbdd391fb4f53', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e23cfd74-a57b-4610-ab28-51062b779dc9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b005592-2b67-4b5e-87ed-f6d87ca37498, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=42febaff-ab08-422c-921c-4d29969e13f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.302 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 42febaff-ab08-422c-921c-4d29969e13f0 in datapath 390460fe-fb7f-40ce-abb7-9e99dea93a54 unbound from our chassis#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.306 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 390460fe-fb7f-40ce-abb7-9e99dea93a54, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.306 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a97b0b57-4c12-430a-9463-f17d8e9ac7b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.307 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 namespace which is not needed anymore#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.308 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Nov 22 03:10:01 np0005531888 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000007c.scope: Consumed 2.698s CPU time.
Nov 22 03:10:01 np0005531888 systemd-machined[153106]: Machine qemu-61-instance-0000007c terminated.
Nov 22 03:10:01 np0005531888 podman[234160]: 2025-11-22 08:10:01.390689294 +0000 UTC m=+0.088832056 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:01 np0005531888 podman[234161]: 2025-11-22 08:10:01.39092457 +0000 UTC m=+0.088383605 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:01 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [NOTICE]   (234001) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:01 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [NOTICE]   (234001) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:01 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [WARNING]  (234001) : Exiting Master process...
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.473 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [ALERT]    (234001) : Current worker (234003) exited with code 143 (Terminated)
Nov 22 03:10:01 np0005531888 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[233997]: [WARNING]  (234001) : All workers exited. Exiting... (0)
Nov 22 03:10:01 np0005531888 systemd[1]: libpod-05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79.scope: Deactivated successfully.
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.480 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 podman[234223]: 2025-11-22 08:10:01.483366933 +0000 UTC m=+0.095015108 container died 05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.522 186792 INFO nova.virt.libvirt.driver [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Instance destroyed successfully.#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.523 186792 DEBUG nova.objects.instance [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'resources' on Instance uuid fd71e2ab-4255-4855-bc4a-28045582ce90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.536 186792 DEBUG nova.virt.libvirt.vif [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-921620997',display_name='tempest-TestServerMultinode-server-921620997',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-921620997',id=124,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-yx28502t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-1734646453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:59Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=fd71e2ab-4255-4855-bc4a-28045582ce90,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.537 186792 DEBUG nova.network.os_vif_util [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "42febaff-ab08-422c-921c-4d29969e13f0", "address": "fa:16:3e:76:67:04", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap42febaff-ab", "ovs_interfaceid": "42febaff-ab08-422c-921c-4d29969e13f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.538 186792 DEBUG nova.network.os_vif_util [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.538 186792 DEBUG os_vif [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.540 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.541 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42febaff-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.542 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.545 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.548 186792 INFO os_vif [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:67:04,bridge_name='br-int',has_traffic_filtering=True,id=42febaff-ab08-422c-921c-4d29969e13f0,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap42febaff-ab')#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.549 186792 INFO nova.virt.libvirt.driver [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Deleting instance files /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90_del#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.550 186792 INFO nova.virt.libvirt.driver [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Deletion of /var/lib/nova/instances/fd71e2ab-4255-4855-bc4a-28045582ce90_del complete#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.641 186792 INFO nova.compute.manager [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.642 186792 DEBUG oslo.service.loopingcall [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.643 186792 DEBUG nova.compute.manager [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.643 186792 DEBUG nova.network.neutron [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:10:01 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:01 np0005531888 systemd[1]: var-lib-containers-storage-overlay-2be26ca35661d824834bba69c3f19c1412d844aa93fde8a9b208a2190feaeb5f-merged.mount: Deactivated successfully.
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.666 186792 DEBUG nova.compute.manager [req-a44d6f5b-8e20-4417-b05b-7d3fc200f5b7 req-859ae5df-7017-4637-b897-1722ef57292b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-vif-unplugged-42febaff-ab08-422c-921c-4d29969e13f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.667 186792 DEBUG oslo_concurrency.lockutils [req-a44d6f5b-8e20-4417-b05b-7d3fc200f5b7 req-859ae5df-7017-4637-b897-1722ef57292b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.667 186792 DEBUG oslo_concurrency.lockutils [req-a44d6f5b-8e20-4417-b05b-7d3fc200f5b7 req-859ae5df-7017-4637-b897-1722ef57292b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.667 186792 DEBUG oslo_concurrency.lockutils [req-a44d6f5b-8e20-4417-b05b-7d3fc200f5b7 req-859ae5df-7017-4637-b897-1722ef57292b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.668 186792 DEBUG nova.compute.manager [req-a44d6f5b-8e20-4417-b05b-7d3fc200f5b7 req-859ae5df-7017-4637-b897-1722ef57292b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] No waiting events found dispatching network-vif-unplugged-42febaff-ab08-422c-921c-4d29969e13f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.668 186792 DEBUG nova.compute.manager [req-a44d6f5b-8e20-4417-b05b-7d3fc200f5b7 req-859ae5df-7017-4637-b897-1722ef57292b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-vif-unplugged-42febaff-ab08-422c-921c-4d29969e13f0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:10:01 np0005531888 podman[234223]: 2025-11-22 08:10:01.75769194 +0000 UTC m=+0.369340115 container cleanup 05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:10:01 np0005531888 systemd[1]: libpod-conmon-05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79.scope: Deactivated successfully.
Nov 22 03:10:01 np0005531888 podman[234272]: 2025-11-22 08:10:01.818504135 +0000 UTC m=+0.040669551 container remove 05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.824 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[90b54ace-a0ab-41c4-9d51-01c05da65710]: (4, ('Sat Nov 22 08:10:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 (05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79)\n05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79\nSat Nov 22 08:10:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 (05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79)\n05fa279687943c54d09e07d6e8a44116fdfc9d2ad5585fb2605a07b9312adc79\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.826 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[003fe134-bef2-4230-aed9-fd46550875bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.827 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap390460fe-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:01 np0005531888 kernel: tap390460fe-f0: left promiscuous mode
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.828 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 nova_compute[186788]: 2025-11-22 08:10:01.841 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.844 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7fef311c-387a-4eff-8634-a121cbb2c33b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.857 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6c94a4ab-55cf-4701-840e-d43ed7e0f6eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.858 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2febc19a-e420-4480-9b7f-6ea5e0fa56f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.874 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[43b94e32-ba8f-4701-adda-73b9a7cbc505]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573687, 'reachable_time': 19724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234286, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.877 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:01.877 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[a3bf8fba-7892-4c89-8f1f-1bd4b2907211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:01 np0005531888 systemd[1]: run-netns-ovnmeta\x2d390460fe\x2dfb7f\x2d40ce\x2dabb7\x2d9e99dea93a54.mount: Deactivated successfully.
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.474 186792 DEBUG nova.network.neutron [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.508 186792 INFO nova.compute.manager [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Took 0.86 seconds to deallocate network for instance.#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.570 186792 DEBUG nova.compute.manager [req-836c3d3f-a4eb-4212-9be3-93892605cda9 req-0d2793e4-aede-4ea7-bb38-9ec185cb7f51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-vif-deleted-42febaff-ab08-422c-921c-4d29969e13f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.592 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.593 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.685 186792 DEBUG nova.compute.provider_tree [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.699 186792 DEBUG nova.scheduler.client.report [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.721 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.767 186792 INFO nova.scheduler.client.report [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Deleted allocations for instance fd71e2ab-4255-4855-bc4a-28045582ce90#033[00m
Nov 22 03:10:02 np0005531888 nova_compute[186788]: 2025-11-22 08:10:02.840 186792 DEBUG oslo_concurrency.lockutils [None req-190f54db-b41d-4492-b259-65c37f57b69a 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.734 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.744 186792 DEBUG nova.compute.manager [req-f2281b0f-cb69-4bfd-8f49-a08fd5958e95 req-98fb9c77-3093-4ee2-9317-c9e3f968dbc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.744 186792 DEBUG oslo_concurrency.lockutils [req-f2281b0f-cb69-4bfd-8f49-a08fd5958e95 req-98fb9c77-3093-4ee2-9317-c9e3f968dbc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.744 186792 DEBUG oslo_concurrency.lockutils [req-f2281b0f-cb69-4bfd-8f49-a08fd5958e95 req-98fb9c77-3093-4ee2-9317-c9e3f968dbc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.744 186792 DEBUG oslo_concurrency.lockutils [req-f2281b0f-cb69-4bfd-8f49-a08fd5958e95 req-98fb9c77-3093-4ee2-9317-c9e3f968dbc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fd71e2ab-4255-4855-bc4a-28045582ce90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.745 186792 DEBUG nova.compute.manager [req-f2281b0f-cb69-4bfd-8f49-a08fd5958e95 req-98fb9c77-3093-4ee2-9317-c9e3f968dbc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] No waiting events found dispatching network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.745 186792 WARNING nova.compute.manager [req-f2281b0f-cb69-4bfd-8f49-a08fd5958e95 req-98fb9c77-3093-4ee2-9317-c9e3f968dbc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Received unexpected event network-vif-plugged-42febaff-ab08-422c-921c-4d29969e13f0 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.820 186792 DEBUG nova.compute.manager [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-vif-plugged-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.820 186792 DEBUG oslo_concurrency.lockutils [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.820 186792 DEBUG oslo_concurrency.lockutils [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.820 186792 DEBUG oslo_concurrency.lockutils [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.820 186792 DEBUG nova.compute.manager [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Processing event network-vif-plugged-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.821 186792 DEBUG nova.compute.manager [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-vif-plugged-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.821 186792 DEBUG oslo_concurrency.lockutils [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.821 186792 DEBUG oslo_concurrency.lockutils [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.821 186792 DEBUG oslo_concurrency.lockutils [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.821 186792 DEBUG nova.compute.manager [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] No waiting events found dispatching network-vif-plugged-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.821 186792 WARNING nova.compute.manager [req-e407edbf-9dd0-43fe-a951-d5888556cf53 req-daaad856-618f-418e-93de-103b15fb24aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received unexpected event network-vif-plugged-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.822 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Instance event wait completed in 7 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.826 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799003.8259683, 8bd0c27f-4042-4314-9eee-7939d2dd2f99 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.826 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.828 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.831 186792 INFO nova.virt.libvirt.driver [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Instance spawned successfully.#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.831 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.847 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.851 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.859 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.859 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.860 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.860 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.861 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.861 186792 DEBUG nova.virt.libvirt.driver [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.885 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.922 186792 INFO nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Took 17.46 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.922 186792 DEBUG nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:03 np0005531888 nova_compute[186788]: 2025-11-22 08:10:03.998 186792 INFO nova.compute.manager [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Took 18.08 seconds to build instance.#033[00m
Nov 22 03:10:04 np0005531888 nova_compute[186788]: 2025-11-22 08:10:04.012 186792 DEBUG oslo_concurrency.lockutils [None req-8a803689-bb5a-43b5-b354-cc07e76f9e08 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.544 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:06 np0005531888 nova_compute[186788]: 2025-11-22 08:10:06.984 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.048 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.108 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.109 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.125 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.169 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.347 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.349 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5558MB free_disk=73.27328109741211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.349 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.349 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.413 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798992.412372, f585652d-c90a-4001-bd55-2ffc90c6bab2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.414 186792 INFO nova.compute.manager [-] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.454 186792 DEBUG nova.compute.manager [None req-ba3c9136-e0c5-448e-81ec-447b00cfbfd6 - - - - - -] [instance: f585652d-c90a-4001-bd55-2ffc90c6bab2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.486 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 8bd0c27f-4042-4314-9eee-7939d2dd2f99 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.487 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.487 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.514 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.541 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.542 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.566 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.593 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.658 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.683 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.734 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:10:07 np0005531888 nova_compute[186788]: 2025-11-22 08:10:07.734 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:08 np0005531888 nova_compute[186788]: 2025-11-22 08:10:08.736 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:10 np0005531888 podman[234295]: 2025-11-22 08:10:10.677471556 +0000 UTC m=+0.050084573 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:10:10 np0005531888 podman[234296]: 2025-11-22 08:10:10.699836756 +0000 UTC m=+0.071342206 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.547 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.735 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.909 186792 DEBUG nova.compute.manager [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-changed-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.909 186792 DEBUG nova.compute.manager [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing instance network info cache due to event network-changed-8bb24240-cb32-4c05-a4f7-1d73a46b1b71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.909 186792 DEBUG oslo_concurrency.lockutils [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.910 186792 DEBUG oslo_concurrency.lockutils [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:11 np0005531888 nova_compute[186788]: 2025-11-22 08:10:11.910 186792 DEBUG nova.network.neutron [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing network info cache for port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:10:12 np0005531888 nova_compute[186788]: 2025-11-22 08:10:12.250 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:13 np0005531888 nova_compute[186788]: 2025-11-22 08:10:13.738 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:14 np0005531888 nova_compute[186788]: 2025-11-22 08:10:14.690 186792 DEBUG nova.network.neutron [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updated VIF entry in instance network info cache for port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:10:14 np0005531888 nova_compute[186788]: 2025-11-22 08:10:14.690 186792 DEBUG nova.network.neutron [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updating instance_info_cache with network_info: [{"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:14 np0005531888 nova_compute[186788]: 2025-11-22 08:10:14.709 186792 DEBUG oslo_concurrency.lockutils [req-f2734fc4-3f75-4e8d-a5db-f0fdef13ccfa req-64003102-3867-442f-8dd3-9d39d928870a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:14 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:14Z|00478|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:14 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:14Z|00479|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:14 np0005531888 nova_compute[186788]: 2025-11-22 08:10:14.961 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:16 np0005531888 nova_compute[186788]: 2025-11-22 08:10:16.522 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799001.520338, fd71e2ab-4255-4855-bc4a-28045582ce90 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:16 np0005531888 nova_compute[186788]: 2025-11-22 08:10:16.523 186792 INFO nova.compute.manager [-] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:10:16 np0005531888 nova_compute[186788]: 2025-11-22 08:10:16.543 186792 DEBUG nova.compute.manager [None req-641bce7b-d96d-4538-a362-f544df75f27a - - - - - -] [instance: fd71e2ab-4255-4855-bc4a-28045582ce90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:16 np0005531888 nova_compute[186788]: 2025-11-22 08:10:16.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:16.775 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:16 np0005531888 nova_compute[186788]: 2025-11-22 08:10:16.776 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:16.776 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:10:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:17.779 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:17 np0005531888 nova_compute[186788]: 2025-11-22 08:10:17.950 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:18Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:73:b6:1e 10.100.0.5
Nov 22 03:10:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:18Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:73:b6:1e 10.100.0.5
Nov 22 03:10:18 np0005531888 nova_compute[186788]: 2025-11-22 08:10:18.740 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:18Z|00480|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:18Z|00481|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:18 np0005531888 nova_compute[186788]: 2025-11-22 08:10:18.976 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:20Z|00482|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:20Z|00483|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:20 np0005531888 nova_compute[186788]: 2025-11-22 08:10:20.062 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:20 np0005531888 podman[234358]: 2025-11-22 08:10:20.683647098 +0000 UTC m=+0.059953256 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 22 03:10:21 np0005531888 nova_compute[186788]: 2025-11-22 08:10:21.551 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:22 np0005531888 podman[234378]: 2025-11-22 08:10:22.683224164 +0000 UTC m=+0.058462788 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:10:23 np0005531888 nova_compute[186788]: 2025-11-22 08:10:23.742 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:26 np0005531888 nova_compute[186788]: 2025-11-22 08:10:26.554 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:26 np0005531888 podman[234402]: 2025-11-22 08:10:26.701042306 +0000 UTC m=+0.063603376 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:10:26 np0005531888 nova_compute[186788]: 2025-11-22 08:10:26.970 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:28 np0005531888 nova_compute[186788]: 2025-11-22 08:10:28.746 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:29 np0005531888 nova_compute[186788]: 2025-11-22 08:10:29.533 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.829 186792 DEBUG nova.compute.manager [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-changed-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.830 186792 DEBUG nova.compute.manager [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing instance network info cache due to event network-changed-8bb24240-cb32-4c05-a4f7-1d73a46b1b71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.830 186792 DEBUG oslo_concurrency.lockutils [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.831 186792 DEBUG oslo_concurrency.lockutils [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.831 186792 DEBUG nova.network.neutron [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Refreshing network info cache for port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.978 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.979 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.979 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.980 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.980 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.988 186792 INFO nova.compute.manager [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Terminating instance#033[00m
Nov 22 03:10:30 np0005531888 nova_compute[186788]: 2025-11-22 08:10:30.994 186792 DEBUG nova.compute.manager [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:10:31 np0005531888 kernel: tap8bb24240-cb (unregistering): left promiscuous mode
Nov 22 03:10:31 np0005531888 NetworkManager[55166]: <info>  [1763799031.0193] device (tap8bb24240-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.028 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00484|binding|INFO|Releasing lport 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 from this chassis (sb_readonly=0)
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00485|binding|INFO|Setting lport 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 down in Southbound
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00486|binding|INFO|Removing iface tap8bb24240-cb ovn-installed in OVS
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.030 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 kernel: tapbf553f99-0d (unregistering): left promiscuous mode
Nov 22 03:10:31 np0005531888 NetworkManager[55166]: <info>  [1763799031.0494] device (tapbf553f99-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00487|binding|INFO|Releasing lport bf553f99-0dbc-4a65-a64a-54074e8070f1 from this chassis (sb_readonly=1)
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.061 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00488|binding|INFO|Removing iface tapbf553f99-0d ovn-installed in OVS
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00489|if_status|INFO|Dropped 5 log messages in last 250 seconds (most recently, 250 seconds ago) due to excessive rate
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00490|if_status|INFO|Not setting lport bf553f99-0dbc-4a65-a64a-54074e8070f1 down as sb is readonly
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.063 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.081 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:31Z|00491|binding|INFO|Setting lport bf553f99-0dbc-4a65-a64a-54074e8070f1 down in Southbound
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.082 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:b6:1e 10.100.0.5'], port_security=['fa:16:3e:73:b6:1e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8bd0c27f-4042-4314-9eee-7939d2dd2f99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c8e809e-e81c-4dfc-8977-f974433d5b3a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=8bb24240-cb32-4c05-a4f7-1d73a46b1b71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.083 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71 in datapath 8591a8a4-c35f-454b-ba4c-4ec37a8765b2 unbound from our chassis#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.085 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8591a8a4-c35f-454b-ba4c-4ec37a8765b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.086 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca11236-3ce1-48fe-95df-f4ac47bd406e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.086 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 namespace which is not needed anymore#033[00m
Nov 22 03:10:31 np0005531888 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Nov 22 03:10:31 np0005531888 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007a.scope: Consumed 14.788s CPU time.
Nov 22 03:10:31 np0005531888 systemd-machined[153106]: Machine qemu-60-instance-0000007a terminated.
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.175 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:4d:e2 2001:db8::f816:3eff:feb2:4de2'], port_security=['fa:16:3e:b2:4d:e2 2001:db8::f816:3eff:feb2:4de2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb2:4de2/64', 'neutron:device_id': '8bd0c27f-4042-4314-9eee-7939d2dd2f99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a4afbec-9e59-4ffa-9128-10dc4f025189, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=bf553f99-0dbc-4a65-a64a-54074e8070f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [NOTICE]   (234072) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [NOTICE]   (234072) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [WARNING]  (234072) : Exiting Master process...
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [WARNING]  (234072) : Exiting Master process...
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [ALERT]    (234072) : Current worker (234074) exited with code 143 (Terminated)
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[234068]: [WARNING]  (234072) : All workers exited. Exiting... (0)
Nov 22 03:10:31 np0005531888 systemd[1]: libpod-8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f.scope: Deactivated successfully.
Nov 22 03:10:31 np0005531888 podman[234454]: 2025-11-22 08:10:31.219276155 +0000 UTC m=+0.051579330 container died 8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:10:31 np0005531888 NetworkManager[55166]: <info>  [1763799031.2262] manager: (tapbf553f99-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/231)
Nov 22 03:10:31 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:31 np0005531888 systemd[1]: var-lib-containers-storage-overlay-b46d5f55b8bef239d3a6a44edabb9a9f94ec06b28b0fcc292bce410b47230fbe-merged.mount: Deactivated successfully.
Nov 22 03:10:31 np0005531888 podman[234454]: 2025-11-22 08:10:31.282086079 +0000 UTC m=+0.114389264 container cleanup 8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.282 186792 INFO nova.virt.libvirt.driver [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Instance destroyed successfully.#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.283 186792 DEBUG nova.objects.instance [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 8bd0c27f-4042-4314-9eee-7939d2dd2f99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:31 np0005531888 systemd[1]: libpod-conmon-8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f.scope: Deactivated successfully.
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.293 186792 DEBUG nova.virt.libvirt.vif [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1540631613',display_name='tempest-TestGettingAddress-server-1540631613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1540631613',id=122,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:10:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-j0pvysn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:10:03Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=8bd0c27f-4042-4314-9eee-7939d2dd2f99,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.294 186792 DEBUG nova.network.os_vif_util [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.294 186792 DEBUG nova.network.os_vif_util [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.295 186792 DEBUG os_vif [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.296 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bb24240-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.298 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.300 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.305 186792 INFO os_vif [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:b6:1e,bridge_name='br-int',has_traffic_filtering=True,id=8bb24240-cb32-4c05-a4f7-1d73a46b1b71,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb24240-cb')#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.306 186792 DEBUG nova.virt.libvirt.vif [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1540631613',display_name='tempest-TestGettingAddress-server-1540631613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1540631613',id=122,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:10:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-j0pvysn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:10:03Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=8bd0c27f-4042-4314-9eee-7939d2dd2f99,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.306 186792 DEBUG nova.network.os_vif_util [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.307 186792 DEBUG nova.network.os_vif_util [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.308 186792 DEBUG os_vif [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.309 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf553f99-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.313 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.315 186792 INFO os_vif [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:4d:e2,bridge_name='br-int',has_traffic_filtering=True,id=bf553f99-0dbc-4a65-a64a-54074e8070f1,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf553f99-0d')#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.316 186792 INFO nova.virt.libvirt.driver [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Deleting instance files /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99_del#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.316 186792 INFO nova.virt.libvirt.driver [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Deletion of /var/lib/nova/instances/8bd0c27f-4042-4314-9eee-7939d2dd2f99_del complete#033[00m
Nov 22 03:10:31 np0005531888 podman[234510]: 2025-11-22 08:10:31.347081768 +0000 UTC m=+0.042895507 container remove 8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.353 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9844139d-9a63-4cde-a625-42847560e355]: (4, ('Sat Nov 22 08:10:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 (8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f)\n8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f\nSat Nov 22 08:10:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 (8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f)\n8a1d6ba92137bce0fc11664488e99d5b750f26ce7454e75e2160ece7f771673f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.354 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfd1ce7-d8ab-472a-8352-2aa762fafa8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.355 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8591a8a4-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 kernel: tap8591a8a4-c0: left promiscuous mode
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.368 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.371 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddcb253-5a28-4cf2-bd3d-a31acb436f45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.381 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca055b56-579f-4745-9be5-925aa05bfff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.382 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[98322867-1ecf-4ebc-8eb9-0b3370574a9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.397 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a796a2-6736-45bf-a371-aef1567dc41a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573773, 'reachable_time': 37801, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234526, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.399 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.399 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[74dd4e86-2a2c-411e-a5ef-71ed238b4975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 systemd[1]: run-netns-ovnmeta\x2d8591a8a4\x2dc35f\x2d454b\x2dba4c\x2d4ec37a8765b2.mount: Deactivated successfully.
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.399 104023 INFO neutron.agent.ovn.metadata.agent [-] Port bf553f99-0dbc-4a65-a64a-54074e8070f1 in datapath 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad unbound from our chassis#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.401 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.401 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5cc7ab-f2f7-46be-8375-efd02896c705]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.402 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad namespace which is not needed anymore#033[00m
Nov 22 03:10:31 np0005531888 podman[234527]: 2025-11-22 08:10:31.496912032 +0000 UTC m=+0.070594616 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:10:31 np0005531888 podman[234531]: 2025-11-22 08:10:31.533383199 +0000 UTC m=+0.102372188 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [NOTICE]   (234147) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [NOTICE]   (234147) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [WARNING]  (234147) : Exiting Master process...
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [ALERT]    (234147) : Current worker (234149) exited with code 143 (Terminated)
Nov 22 03:10:31 np0005531888 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[234143]: [WARNING]  (234147) : All workers exited. Exiting... (0)
Nov 22 03:10:31 np0005531888 systemd[1]: libpod-62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112.scope: Deactivated successfully.
Nov 22 03:10:31 np0005531888 podman[234572]: 2025-11-22 08:10:31.548301786 +0000 UTC m=+0.060201671 container died 62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:10:31 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:31 np0005531888 systemd[1]: var-lib-containers-storage-overlay-3389f4b4c4d86fe38df56e8e7d90229f0b62948d1ce55388bb30812be1bbd40c-merged.mount: Deactivated successfully.
Nov 22 03:10:31 np0005531888 podman[234572]: 2025-11-22 08:10:31.58545441 +0000 UTC m=+0.097354295 container cleanup 62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:10:31 np0005531888 systemd[1]: libpod-conmon-62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112.scope: Deactivated successfully.
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.607 186792 INFO nova.compute.manager [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Took 0.61 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.608 186792 DEBUG oslo.service.loopingcall [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.608 186792 DEBUG nova.compute.manager [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.608 186792 DEBUG nova.network.neutron [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:10:31 np0005531888 podman[234615]: 2025-11-22 08:10:31.646138093 +0000 UTC m=+0.042801854 container remove 62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.651 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c64f5356-adb1-4148-809b-3bb6817bb10c]: (4, ('Sat Nov 22 08:10:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad (62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112)\n62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112\nSat Nov 22 08:10:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad (62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112)\n62fda811fa5c1050d696004179d37a357cf2bc78f1905ce974eb7fc702caa112\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.653 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[32d1e4ff-fc22-4b1b-add7-e90cef5e9315]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.654 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8e7fc1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.656 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 kernel: tap6a8e7fc1-60: left promiscuous mode
Nov 22 03:10:31 np0005531888 nova_compute[186788]: 2025-11-22 08:10:31.668 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.671 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b977fd-f519-42f9-a524-37b0967a552a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.690 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ca2d0b-c8f1-4374-81b3-145137006ad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.691 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[da3a93f7-1c5e-4461-bd42-7e3578c49c72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.706 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4145d76f-da5a-412f-bd1c-833bc759bbf2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573858, 'reachable_time': 37071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234629, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.710 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:31.710 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae0784b-7ce5-49c1-93a3-c8dd78e6ae7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:32 np0005531888 systemd[1]: run-netns-ovnmeta\x2d6a8e7fc1\x2d6ea3\x2d4bc9\x2d85d9\x2df62acc4ca9ad.mount: Deactivated successfully.
Nov 22 03:10:33 np0005531888 nova_compute[186788]: 2025-11-22 08:10:33.739 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:33 np0005531888 nova_compute[186788]: 2025-11-22 08:10:33.747 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:34 np0005531888 nova_compute[186788]: 2025-11-22 08:10:34.955 186792 DEBUG nova.network.neutron [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updated VIF entry in instance network info cache for port 8bb24240-cb32-4c05-a4f7-1d73a46b1b71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:10:34 np0005531888 nova_compute[186788]: 2025-11-22 08:10:34.956 186792 DEBUG nova.network.neutron [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updating instance_info_cache with network_info: [{"id": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "address": "fa:16:3e:73:b6:1e", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb24240-cb", "ovs_interfaceid": "8bb24240-cb32-4c05-a4f7-1d73a46b1b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "address": "fa:16:3e:b2:4d:e2", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb2:4de2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf553f99-0d", "ovs_interfaceid": "bf553f99-0dbc-4a65-a64a-54074e8070f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:34 np0005531888 nova_compute[186788]: 2025-11-22 08:10:34.976 186792 DEBUG oslo_concurrency.lockutils [req-8e9cf218-aefe-4864-a458-cb2f016b41b2 req-a6b9e075-0620-426f-8052-e59d63bec367 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8bd0c27f-4042-4314-9eee-7939d2dd2f99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.313 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.779 186792 DEBUG nova.network.neutron [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.812 186792 INFO nova.compute.manager [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Took 5.20 seconds to deallocate network for instance.#033[00m
Nov 22 03:10:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:36.825 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:36.826 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:36.826 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.845 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.846 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:10:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.873 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.874 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.932 186792 DEBUG nova.compute.provider_tree [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.948 186792 DEBUG nova.scheduler.client.report [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.966 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.988 186792 INFO nova.scheduler.client.report [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 8bd0c27f-4042-4314-9eee-7939d2dd2f99#033[00m
Nov 22 03:10:36 np0005531888 nova_compute[186788]: 2025-11-22 08:10:36.991 186792 DEBUG nova.compute.manager [req-79b10896-9956-4c84-9e28-d2096612fb37 req-bd039a9c-eeb2-44a1-967d-71ec5d5ffc99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-vif-deleted-8bb24240-cb32-4c05-a4f7-1d73a46b1b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:37 np0005531888 nova_compute[186788]: 2025-11-22 08:10:37.064 186792 DEBUG oslo_concurrency.lockutils [None req-48aa4dcc-69c1-4456-af83-64e00fe3bef9 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "8bd0c27f-4042-4314-9eee-7939d2dd2f99" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:37 np0005531888 nova_compute[186788]: 2025-11-22 08:10:37.120 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:37 np0005531888 nova_compute[186788]: 2025-11-22 08:10:37.470 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:37 np0005531888 nova_compute[186788]: 2025-11-22 08:10:37.718 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.383 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.383 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.400 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.483 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.484 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.490 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.490 186792 INFO nova.compute.claims [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.605 186792 DEBUG nova.compute.provider_tree [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.618 186792 DEBUG nova.scheduler.client.report [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.638 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.638 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.691 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.692 186792 DEBUG nova.network.neutron [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.710 186792 INFO nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.726 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.887 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.889 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.889 186792 INFO nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Creating image(s)#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.890 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "/var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.890 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "/var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.891 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "/var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.907 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.967 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.968 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.968 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:38 np0005531888 nova_compute[186788]: 2025-11-22 08:10:38.985 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.042 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.043 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.088 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.090 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.090 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.138 186792 DEBUG nova.policy [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b428b7f3de34b3eb007bbf92c75f340', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9efe4ed59a624fa4b5d2bc27fe981f24', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.157 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.158 186792 DEBUG nova.virt.disk.api [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Checking if we can resize image /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.158 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.227 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.228 186792 DEBUG nova.virt.disk.api [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Cannot resize image /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.229 186792 DEBUG nova.objects.instance [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lazy-loading 'migration_context' on Instance uuid 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.246 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.247 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Ensure instance console log exists: /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.247 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.247 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.248 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:39 np0005531888 nova_compute[186788]: 2025-11-22 08:10:39.251 186792 DEBUG nova.compute.manager [req-6b7b4885-3851-4542-a6f5-2cf943cd0785 req-52ced9a3-2a7c-4b92-92c3-d6816465ad61 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Received event network-vif-deleted-bf553f99-0dbc-4a65-a64a-54074e8070f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:40 np0005531888 nova_compute[186788]: 2025-11-22 08:10:40.648 186792 DEBUG nova.network.neutron [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Successfully created port: d4f01247-c774-48c3-a540-be77c998237b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:10:41 np0005531888 nova_compute[186788]: 2025-11-22 08:10:41.316 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:41 np0005531888 podman[234647]: 2025-11-22 08:10:41.681256968 +0000 UTC m=+0.053981499 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:10:41 np0005531888 podman[234646]: 2025-11-22 08:10:41.681265308 +0000 UTC m=+0.055405904 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:10:42 np0005531888 nova_compute[186788]: 2025-11-22 08:10:42.316 186792 DEBUG nova.network.neutron [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Successfully updated port: d4f01247-c774-48c3-a540-be77c998237b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:10:42 np0005531888 nova_compute[186788]: 2025-11-22 08:10:42.332 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "refresh_cache-7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:42 np0005531888 nova_compute[186788]: 2025-11-22 08:10:42.332 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquired lock "refresh_cache-7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:42 np0005531888 nova_compute[186788]: 2025-11-22 08:10:42.332 186792 DEBUG nova.network.neutron [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:10:42 np0005531888 nova_compute[186788]: 2025-11-22 08:10:42.589 186792 DEBUG nova.network.neutron [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.340 186792 DEBUG nova.compute.manager [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-changed-d4f01247-c774-48c3-a540-be77c998237b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.340 186792 DEBUG nova.compute.manager [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Refreshing instance network info cache due to event network-changed-d4f01247-c774-48c3-a540-be77c998237b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.341 186792 DEBUG oslo_concurrency.lockutils [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.593 186792 DEBUG nova.network.neutron [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Updating instance_info_cache with network_info: [{"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.614 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Releasing lock "refresh_cache-7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.615 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Instance network_info: |[{"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.615 186792 DEBUG oslo_concurrency.lockutils [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.615 186792 DEBUG nova.network.neutron [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Refreshing network info cache for port d4f01247-c774-48c3-a540-be77c998237b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.619 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Start _get_guest_xml network_info=[{"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.623 186792 WARNING nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.627 186792 DEBUG nova.virt.libvirt.host [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.628 186792 DEBUG nova.virt.libvirt.host [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.634 186792 DEBUG nova.virt.libvirt.host [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.635 186792 DEBUG nova.virt.libvirt.host [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.636 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.637 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.637 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.637 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.638 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.638 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.638 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.639 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.639 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.639 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.639 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.640 186792 DEBUG nova.virt.hardware [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.644 186792 DEBUG nova.virt.libvirt.vif [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-2069296665',display_name='tempest-NoVNCConsoleTestJSON-server-2069296665',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-2069296665',id=128,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9efe4ed59a624fa4b5d2bc27fe981f24',ramdisk_id='',reservation_id='r-r0t46ty2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-1376948029',owner_user_name='tempest-NoVNCConsoleTestJSON-1376948029-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:10:38Z,user_data=None,user_id='9b428b7f3de34b3eb007bbf92c75f340',uuid=7afef8c7-aef7-4c8a-a7a5-452a8d1283ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.645 186792 DEBUG nova.network.os_vif_util [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Converting VIF {"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.646 186792 DEBUG nova.network.os_vif_util [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.647 186792 DEBUG nova.objects.instance [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.678 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <uuid>7afef8c7-aef7-4c8a-a7a5-452a8d1283ee</uuid>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <name>instance-00000080</name>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:name>tempest-NoVNCConsoleTestJSON-server-2069296665</nova:name>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:10:43</nova:creationTime>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:user uuid="9b428b7f3de34b3eb007bbf92c75f340">tempest-NoVNCConsoleTestJSON-1376948029-project-member</nova:user>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:project uuid="9efe4ed59a624fa4b5d2bc27fe981f24">tempest-NoVNCConsoleTestJSON-1376948029</nova:project>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        <nova:port uuid="d4f01247-c774-48c3-a540-be77c998237b">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <entry name="serial">7afef8c7-aef7-4c8a-a7a5-452a8d1283ee</entry>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <entry name="uuid">7afef8c7-aef7-4c8a-a7a5-452a8d1283ee</entry>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.config"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:ab:b3:5d"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <target dev="tapd4f01247-c7"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/console.log" append="off"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:10:43 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:10:43 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:10:43 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:10:43 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.679 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Preparing to wait for external event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.679 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.680 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.680 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.682 186792 DEBUG nova.virt.libvirt.vif [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-2069296665',display_name='tempest-NoVNCConsoleTestJSON-server-2069296665',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-2069296665',id=128,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9efe4ed59a624fa4b5d2bc27fe981f24',ramdisk_id='',reservation_id='r-r0t46ty2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-1376948029',owner_user_name='tempest-NoVNCConsole
TestJSON-1376948029-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:10:38Z,user_data=None,user_id='9b428b7f3de34b3eb007bbf92c75f340',uuid=7afef8c7-aef7-4c8a-a7a5-452a8d1283ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.682 186792 DEBUG nova.network.os_vif_util [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Converting VIF {"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.683 186792 DEBUG nova.network.os_vif_util [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.683 186792 DEBUG os_vif [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.684 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.684 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.685 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.687 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.687 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4f01247-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.688 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4f01247-c7, col_values=(('external_ids', {'iface-id': 'd4f01247-c774-48c3-a540-be77c998237b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:b3:5d', 'vm-uuid': '7afef8c7-aef7-4c8a-a7a5-452a8d1283ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.690 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.691 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531888 NetworkManager[55166]: <info>  [1763799043.6928] manager: (tapd4f01247-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.693 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.700 186792 INFO os_vif [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7')#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.749 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.754 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.754 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.754 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] No VIF found with MAC fa:16:3e:ab:b3:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:10:43 np0005531888 nova_compute[186788]: 2025-11-22 08:10:43.755 186792 INFO nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Using config drive#033[00m
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.514 186792 INFO nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Creating config drive at /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.config#033[00m
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.518 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvwj9_asn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.646 186792 DEBUG oslo_concurrency.processutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvwj9_asn" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:44 np0005531888 kernel: tapd4f01247-c7: entered promiscuous mode
Nov 22 03:10:44 np0005531888 NetworkManager[55166]: <info>  [1763799044.7091] manager: (tapd4f01247-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/233)
Nov 22 03:10:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:44Z|00492|binding|INFO|Claiming lport d4f01247-c774-48c3-a540-be77c998237b for this chassis.
Nov 22 03:10:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:44Z|00493|binding|INFO|d4f01247-c774-48c3-a540-be77c998237b: Claiming fa:16:3e:ab:b3:5d 10.100.0.14
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.710 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.727 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:b3:5d 10.100.0.14'], port_security=['fa:16:3e:ab:b3:5d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7afef8c7-aef7-4c8a-a7a5-452a8d1283ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9efe4ed59a624fa4b5d2bc27fe981f24', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3642cac-086c-43f0-b5a6-6cbe95c25c84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af88b1d0-b5ce-43c7-b82e-49d200ccbffa, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d4f01247-c774-48c3-a540-be77c998237b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.730 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d4f01247-c774-48c3-a540-be77c998237b in datapath fdd4a70d-b5e2-4660-bde8-e340d7113b23 bound to our chassis#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.732 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fdd4a70d-b5e2-4660-bde8-e340d7113b23#033[00m
Nov 22 03:10:44 np0005531888 systemd-udevd[234706]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.742 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[de547b48-17d4-40ed-9974-a1b68ec3fe64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.744 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfdd4a70d-b1 in ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.746 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfdd4a70d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.746 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3c807746-8081-4220-afac-afdf954c9ddf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.747 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[966f706b-502b-4fd3-aaab-1d097889ec37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 NetworkManager[55166]: <info>  [1763799044.7500] device (tapd4f01247-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:10:44 np0005531888 systemd-machined[153106]: New machine qemu-62-instance-00000080.
Nov 22 03:10:44 np0005531888 NetworkManager[55166]: <info>  [1763799044.7517] device (tapd4f01247-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.759 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0f95dda1-e227-4c20-b8d4-fc52093c1d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.767 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:44Z|00494|binding|INFO|Setting lport d4f01247-c774-48c3-a540-be77c998237b ovn-installed in OVS
Nov 22 03:10:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:44Z|00495|binding|INFO|Setting lport d4f01247-c774-48c3-a540-be77c998237b up in Southbound
Nov 22 03:10:44 np0005531888 systemd[1]: Started Virtual Machine qemu-62-instance-00000080.
Nov 22 03:10:44 np0005531888 nova_compute[186788]: 2025-11-22 08:10:44.773 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.778 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3d753f-8883-4594-83d4-2e396e6cb508]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.807 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2013f3f0-7f3c-4a65-9bfd-a166d33ee073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.812 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[372e9993-0be2-4963-a89c-5746cbfc8bea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 NetworkManager[55166]: <info>  [1763799044.8137] manager: (tapfdd4a70d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/234)
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.844 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1b11cb-7aef-4dbc-ad74-cfc6d3c4a476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.847 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[52fc4c82-a602-498c-a064-d551321de172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 NetworkManager[55166]: <info>  [1763799044.8701] device (tapfdd4a70d-b0): carrier: link connected
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.875 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe359d8-adb2-4399-8cc0-0e7bc31a7035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.894 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6805105-f52f-424e-9bef-5813de5a5f0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfdd4a70d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:5e:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578551, 'reachable_time': 23700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234740, 'error': None, 'target': 'ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.913 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be0f132a-1005-4e5f-81b0-b9eb4dce61a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5a:5e60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578551, 'tstamp': 578551}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234741, 'error': None, 'target': 'ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.931 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[664c6af0-c6fd-44ff-abeb-43f49cdf90e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfdd4a70d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:5e:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578551, 'reachable_time': 23700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234742, 'error': None, 'target': 'ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:44.969 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[490c5682-eb5f-4fdf-baee-a5670df22d53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.035 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca03f0b3-560c-47e7-b9a3-abe6295e20b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.039 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdd4a70d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.040 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.040 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfdd4a70d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:45 np0005531888 NetworkManager[55166]: <info>  [1763799045.0433] manager: (tapfdd4a70d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Nov 22 03:10:45 np0005531888 kernel: tapfdd4a70d-b0: entered promiscuous mode
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.042 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.045 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.046 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfdd4a70d-b0, col_values=(('external_ids', {'iface-id': '68cba04c-b85c-4dfc-a385-2d9391f39967'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:45Z|00496|binding|INFO|Releasing lport 68cba04c-b85c-4dfc-a385-2d9391f39967 from this chassis (sb_readonly=0)
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.051 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.052 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fdd4a70d-b5e2-4660-bde8-e340d7113b23.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fdd4a70d-b5e2-4660-bde8-e340d7113b23.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.053 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d2b227-8d6c-4bc9-978d-e6d1d03338cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.054 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-fdd4a70d-b5e2-4660-bde8-e340d7113b23
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/fdd4a70d-b5e2-4660-bde8-e340d7113b23.pid.haproxy
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID fdd4a70d-b5e2-4660-bde8-e340d7113b23
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:10:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:45.054 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'env', 'PROCESS_TAG=haproxy-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fdd4a70d-b5e2-4660-bde8-e340d7113b23.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.068 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.200 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799045.2000349, 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.201 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] VM Started (Lifecycle Event)#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.235 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.241 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799045.2001433, 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.241 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.260 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.263 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.282 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:10:45 np0005531888 podman[234780]: 2025-11-22 08:10:45.4157247 +0000 UTC m=+0.049514788 container create 959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:45 np0005531888 systemd[1]: Started libpod-conmon-959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30.scope.
Nov 22 03:10:45 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:10:45 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62d517033d38f210a97d41fca76366e253f1094e99ef069ca35936ff8d1b4552/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:10:45 np0005531888 podman[234780]: 2025-11-22 08:10:45.387602019 +0000 UTC m=+0.021392137 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.489 186792 DEBUG nova.network.neutron [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Updated VIF entry in instance network info cache for port d4f01247-c774-48c3-a540-be77c998237b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.490 186792 DEBUG nova.network.neutron [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Updating instance_info_cache with network_info: [{"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:45 np0005531888 podman[234780]: 2025-11-22 08:10:45.490321215 +0000 UTC m=+0.124111323 container init 959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 03:10:45 np0005531888 podman[234780]: 2025-11-22 08:10:45.495609295 +0000 UTC m=+0.129399383 container start 959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:45 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [NOTICE]   (234799) : New worker (234801) forked
Nov 22 03:10:45 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [NOTICE]   (234799) : Loading success.
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.521 186792 DEBUG oslo_concurrency.lockutils [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.549 186792 DEBUG nova.compute.manager [req-36bc2d4f-d5bc-4f55-a536-321a8bb0c837 req-2007ce78-8ae0-4d30-be4c-0fd8b756f393 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.550 186792 DEBUG oslo_concurrency.lockutils [req-36bc2d4f-d5bc-4f55-a536-321a8bb0c837 req-2007ce78-8ae0-4d30-be4c-0fd8b756f393 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.550 186792 DEBUG oslo_concurrency.lockutils [req-36bc2d4f-d5bc-4f55-a536-321a8bb0c837 req-2007ce78-8ae0-4d30-be4c-0fd8b756f393 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.550 186792 DEBUG oslo_concurrency.lockutils [req-36bc2d4f-d5bc-4f55-a536-321a8bb0c837 req-2007ce78-8ae0-4d30-be4c-0fd8b756f393 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.551 186792 DEBUG nova.compute.manager [req-36bc2d4f-d5bc-4f55-a536-321a8bb0c837 req-2007ce78-8ae0-4d30-be4c-0fd8b756f393 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Processing event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.551 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.555 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799045.5555484, 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.556 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.557 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.561 186792 INFO nova.virt.libvirt.driver [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Instance spawned successfully.#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.562 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.586 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.591 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.592 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.592 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.592 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.593 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.593 186792 DEBUG nova.virt.libvirt.driver [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.597 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.643 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.687 186792 INFO nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Took 6.80 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.687 186792 DEBUG nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.759 186792 INFO nova.compute.manager [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Took 7.30 seconds to build instance.#033[00m
Nov 22 03:10:45 np0005531888 nova_compute[186788]: 2025-11-22 08:10:45.784 186792 DEBUG oslo_concurrency.lockutils [None req-8a2b2b1b-b926-4e65-993e-0eb44f4afa41 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:46 np0005531888 nova_compute[186788]: 2025-11-22 08:10:46.279 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799031.2788448, 8bd0c27f-4042-4314-9eee-7939d2dd2f99 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:46 np0005531888 nova_compute[186788]: 2025-11-22 08:10:46.280 186792 INFO nova.compute.manager [-] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:10:46 np0005531888 nova_compute[186788]: 2025-11-22 08:10:46.299 186792 DEBUG nova.compute.manager [None req-e5f085c9-0d36-48b5-a847-4866c03334db - - - - - -] [instance: 8bd0c27f-4042-4314-9eee-7939d2dd2f99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:47 np0005531888 nova_compute[186788]: 2025-11-22 08:10:47.794 186792 DEBUG nova.compute.manager [req-d9340c07-bf7a-41d4-978a-263e46a2eadd req-6de90bb6-b5e4-47db-a631-d2f338c8c0a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:47 np0005531888 nova_compute[186788]: 2025-11-22 08:10:47.794 186792 DEBUG oslo_concurrency.lockutils [req-d9340c07-bf7a-41d4-978a-263e46a2eadd req-6de90bb6-b5e4-47db-a631-d2f338c8c0a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:47 np0005531888 nova_compute[186788]: 2025-11-22 08:10:47.794 186792 DEBUG oslo_concurrency.lockutils [req-d9340c07-bf7a-41d4-978a-263e46a2eadd req-6de90bb6-b5e4-47db-a631-d2f338c8c0a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:47 np0005531888 nova_compute[186788]: 2025-11-22 08:10:47.795 186792 DEBUG oslo_concurrency.lockutils [req-d9340c07-bf7a-41d4-978a-263e46a2eadd req-6de90bb6-b5e4-47db-a631-d2f338c8c0a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:47 np0005531888 nova_compute[186788]: 2025-11-22 08:10:47.795 186792 DEBUG nova.compute.manager [req-d9340c07-bf7a-41d4-978a-263e46a2eadd req-6de90bb6-b5e4-47db-a631-d2f338c8c0a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] No waiting events found dispatching network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:47 np0005531888 nova_compute[186788]: 2025-11-22 08:10:47.795 186792 WARNING nova.compute.manager [req-d9340c07-bf7a-41d4-978a-263e46a2eadd req-6de90bb6-b5e4-47db-a631-d2f338c8c0a1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received unexpected event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b for instance with vm_state active and task_state None.#033[00m
Nov 22 03:10:48 np0005531888 nova_compute[186788]: 2025-11-22 08:10:48.458 186792 DEBUG nova.compute.manager [None req-e6dba46c-ac99-44df-802a-a2e8e99b8c72 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Nov 22 03:10:48 np0005531888 nova_compute[186788]: 2025-11-22 08:10:48.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:48 np0005531888 nova_compute[186788]: 2025-11-22 08:10:48.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.071 186792 DEBUG nova.compute.manager [None req-d289fcac-c625-4b72-8d9c-a630a1cd5062 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.573 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.574 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.575 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.575 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.576 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.585 186792 INFO nova.compute.manager [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Terminating instance#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.593 186792 DEBUG nova.compute.manager [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:10:49 np0005531888 kernel: tapd4f01247-c7 (unregistering): left promiscuous mode
Nov 22 03:10:49 np0005531888 NetworkManager[55166]: <info>  [1763799049.6123] device (tapd4f01247-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:49 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:49Z|00497|binding|INFO|Releasing lport d4f01247-c774-48c3-a540-be77c998237b from this chassis (sb_readonly=0)
Nov 22 03:10:49 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:49Z|00498|binding|INFO|Setting lport d4f01247-c774-48c3-a540-be77c998237b down in Southbound
Nov 22 03:10:49 np0005531888 ovn_controller[95067]: 2025-11-22T08:10:49Z|00499|binding|INFO|Removing iface tapd4f01247-c7 ovn-installed in OVS
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.623 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.626 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.637 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:b3:5d 10.100.0.14'], port_security=['fa:16:3e:ab:b3:5d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7afef8c7-aef7-4c8a-a7a5-452a8d1283ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9efe4ed59a624fa4b5d2bc27fe981f24', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3642cac-086c-43f0-b5a6-6cbe95c25c84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af88b1d0-b5ce-43c7-b82e-49d200ccbffa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d4f01247-c774-48c3-a540-be77c998237b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.639 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d4f01247-c774-48c3-a540-be77c998237b in datapath fdd4a70d-b5e2-4660-bde8-e340d7113b23 unbound from our chassis#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.641 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.641 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fdd4a70d-b5e2-4660-bde8-e340d7113b23, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.643 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3f92e9-57b3-4dec-be0b-83dda5dfbdee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.644 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23 namespace which is not needed anymore#033[00m
Nov 22 03:10:49 np0005531888 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000080.scope: Deactivated successfully.
Nov 22 03:10:49 np0005531888 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000080.scope: Consumed 4.497s CPU time.
Nov 22 03:10:49 np0005531888 systemd-machined[153106]: Machine qemu-62-instance-00000080 terminated.
Nov 22 03:10:49 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [NOTICE]   (234799) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:49 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [NOTICE]   (234799) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:49 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [WARNING]  (234799) : Exiting Master process...
Nov 22 03:10:49 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [ALERT]    (234799) : Current worker (234801) exited with code 143 (Terminated)
Nov 22 03:10:49 np0005531888 neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23[234795]: [WARNING]  (234799) : All workers exited. Exiting... (0)
Nov 22 03:10:49 np0005531888 systemd[1]: libpod-959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30.scope: Deactivated successfully.
Nov 22 03:10:49 np0005531888 podman[234834]: 2025-11-22 08:10:49.779129971 +0000 UTC m=+0.048201327 container died 959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:49 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:49 np0005531888 systemd[1]: var-lib-containers-storage-overlay-62d517033d38f210a97d41fca76366e253f1094e99ef069ca35936ff8d1b4552-merged.mount: Deactivated successfully.
Nov 22 03:10:49 np0005531888 podman[234834]: 2025-11-22 08:10:49.819701889 +0000 UTC m=+0.088773245 container cleanup 959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:10:49 np0005531888 systemd[1]: libpod-conmon-959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30.scope: Deactivated successfully.
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.857 186792 INFO nova.virt.libvirt.driver [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Instance destroyed successfully.#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.860 186792 DEBUG nova.objects.instance [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lazy-loading 'resources' on Instance uuid 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.877 186792 DEBUG nova.virt.libvirt.vif [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:10:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-2069296665',display_name='tempest-NoVNCConsoleTestJSON-server-2069296665',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-2069296665',id=128,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:10:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9efe4ed59a624fa4b5d2bc27fe981f24',ramdisk_id='',reservation_id='r-r0t46ty2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-1376948029',owner_user_name='tempest-NoVNCConsoleTestJSON-1376948029-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:10:45Z,user_data=None,user_id='9b428b7f3de34b3eb007bbf92c75f340',uuid=7afef8c7-aef7-4c8a-a7a5-452a8d1283ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.877 186792 DEBUG nova.network.os_vif_util [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Converting VIF {"id": "d4f01247-c774-48c3-a540-be77c998237b", "address": "fa:16:3e:ab:b3:5d", "network": {"id": "fdd4a70d-b5e2-4660-bde8-e340d7113b23", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-640787309-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9efe4ed59a624fa4b5d2bc27fe981f24", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f01247-c7", "ovs_interfaceid": "d4f01247-c774-48c3-a540-be77c998237b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.878 186792 DEBUG nova.network.os_vif_util [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.879 186792 DEBUG os_vif [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.881 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4f01247-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.882 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.885 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:49 np0005531888 podman[234869]: 2025-11-22 08:10:49.886513321 +0000 UTC m=+0.043467370 container remove 959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.887 186792 INFO os_vif [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:b3:5d,bridge_name='br-int',has_traffic_filtering=True,id=d4f01247-c774-48c3-a540-be77c998237b,network=Network(fdd4a70d-b5e2-4660-bde8-e340d7113b23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f01247-c7')#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.888 186792 INFO nova.virt.libvirt.driver [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Deleting instance files /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee_del#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.889 186792 INFO nova.virt.libvirt.driver [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Deletion of /var/lib/nova/instances/7afef8c7-aef7-4c8a-a7a5-452a8d1283ee_del complete#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.893 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[25120fd8-7438-4162-9aba-0832a8b82ce4]: (4, ('Sat Nov 22 08:10:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23 (959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30)\n959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30\nSat Nov 22 08:10:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23 (959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30)\n959792fa165f9174169bca95ed47d94a4c1dc8c3c8e5ba3e1b4be0666235ff30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.895 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eafe67ba-3714-48d6-927f-11d11b0a64d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.896 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdd4a70d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:49 np0005531888 kernel: tapfdd4a70d-b0: left promiscuous mode
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.898 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.909 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.912 186792 DEBUG nova.compute.manager [req-3af30de7-2485-4025-b9ed-6d8ef65f9942 req-1ae13570-a1bc-4f90-a3b1-b9bd92dbc7b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-vif-unplugged-d4f01247-c774-48c3-a540-be77c998237b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.912 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2776492a-cf80-4b5a-b232-50776f586475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.913 186792 DEBUG oslo_concurrency.lockutils [req-3af30de7-2485-4025-b9ed-6d8ef65f9942 req-1ae13570-a1bc-4f90-a3b1-b9bd92dbc7b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.913 186792 DEBUG oslo_concurrency.lockutils [req-3af30de7-2485-4025-b9ed-6d8ef65f9942 req-1ae13570-a1bc-4f90-a3b1-b9bd92dbc7b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.913 186792 DEBUG oslo_concurrency.lockutils [req-3af30de7-2485-4025-b9ed-6d8ef65f9942 req-1ae13570-a1bc-4f90-a3b1-b9bd92dbc7b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.913 186792 DEBUG nova.compute.manager [req-3af30de7-2485-4025-b9ed-6d8ef65f9942 req-1ae13570-a1bc-4f90-a3b1-b9bd92dbc7b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] No waiting events found dispatching network-vif-unplugged-d4f01247-c774-48c3-a540-be77c998237b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.914 186792 DEBUG nova.compute.manager [req-3af30de7-2485-4025-b9ed-6d8ef65f9942 req-1ae13570-a1bc-4f90-a3b1-b9bd92dbc7b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-vif-unplugged-d4f01247-c774-48c3-a540-be77c998237b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.930 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4bfb0e-120d-4cc9-b63c-de737befc7d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.931 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7b7656-4ab9-4d91-acc8-de8799d07999]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.950 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[142bd6f9-50f5-4fdd-a0c5-a7dcacad999b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578544, 'reachable_time': 36103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234895, 'error': None, 'target': 'ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.952 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fdd4a70d-b5e2-4660-bde8-e340d7113b23 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:10:49.952 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6612ce-3389-4e1b-94ad-e5295a226231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:49 np0005531888 systemd[1]: run-netns-ovnmeta\x2dfdd4a70d\x2db5e2\x2d4660\x2dbde8\x2de340d7113b23.mount: Deactivated successfully.
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.962 186792 INFO nova.compute.manager [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.963 186792 DEBUG oslo.service.loopingcall [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.963 186792 DEBUG nova.compute.manager [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:10:49 np0005531888 nova_compute[186788]: 2025-11-22 08:10:49.963 186792 DEBUG nova.network.neutron [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.790 186792 DEBUG nova.network.neutron [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.827 186792 INFO nova.compute.manager [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Took 0.86 seconds to deallocate network for instance.#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.894 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.894 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.960 186792 DEBUG nova.compute.provider_tree [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.978 186792 DEBUG nova.scheduler.client.report [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:50 np0005531888 nova_compute[186788]: 2025-11-22 08:10:50.996 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:51 np0005531888 nova_compute[186788]: 2025-11-22 08:10:51.031 186792 INFO nova.scheduler.client.report [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Deleted allocations for instance 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee#033[00m
Nov 22 03:10:51 np0005531888 nova_compute[186788]: 2025-11-22 08:10:51.107 186792 DEBUG oslo_concurrency.lockutils [None req-13b8aebc-aeab-42a2-9076-e73685a87a38 9b428b7f3de34b3eb007bbf92c75f340 9efe4ed59a624fa4b5d2bc27fe981f24 - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:51 np0005531888 podman[234896]: 2025-11-22 08:10:51.678594984 +0000 UTC m=+0.050245067 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.257 186792 DEBUG nova.compute.manager [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.257 186792 DEBUG oslo_concurrency.lockutils [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.258 186792 DEBUG oslo_concurrency.lockutils [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.258 186792 DEBUG oslo_concurrency.lockutils [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7afef8c7-aef7-4c8a-a7a5-452a8d1283ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.258 186792 DEBUG nova.compute.manager [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] No waiting events found dispatching network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.258 186792 WARNING nova.compute.manager [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received unexpected event network-vif-plugged-d4f01247-c774-48c3-a540-be77c998237b for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:10:52 np0005531888 nova_compute[186788]: 2025-11-22 08:10:52.258 186792 DEBUG nova.compute.manager [req-cfab20ee-db0e-4c23-bbe8-56f7270abedc req-fdc424b8-bdad-40e8-8282-baa2b3a9687a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Received event network-vif-deleted-d4f01247-c774-48c3-a540-be77c998237b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:53 np0005531888 podman[234917]: 2025-11-22 08:10:53.680661161 +0000 UTC m=+0.048353729 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:10:53 np0005531888 nova_compute[186788]: 2025-11-22 08:10:53.752 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:54 np0005531888 nova_compute[186788]: 2025-11-22 08:10:54.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:54 np0005531888 nova_compute[186788]: 2025-11-22 08:10:54.883 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:57 np0005531888 podman[234942]: 2025-11-22 08:10:57.68164001 +0000 UTC m=+0.057625659 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 22 03:10:57 np0005531888 nova_compute[186788]: 2025-11-22 08:10:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:58 np0005531888 nova_compute[186788]: 2025-11-22 08:10:58.754 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:58 np0005531888 nova_compute[186788]: 2025-11-22 08:10:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:58 np0005531888 nova_compute[186788]: 2025-11-22 08:10:58.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:10:58 np0005531888 nova_compute[186788]: 2025-11-22 08:10:58.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:10:58 np0005531888 nova_compute[186788]: 2025-11-22 08:10:58.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:10:59 np0005531888 nova_compute[186788]: 2025-11-22 08:10:59.885 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:59 np0005531888 nova_compute[186788]: 2025-11-22 08:10:59.961 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:01 np0005531888 podman[234963]: 2025-11-22 08:11:01.696648081 +0000 UTC m=+0.067679755 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:11:01 np0005531888 podman[234964]: 2025-11-22 08:11:01.755620121 +0000 UTC m=+0.121117899 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:11:01 np0005531888 nova_compute[186788]: 2025-11-22 08:11:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:02 np0005531888 nova_compute[186788]: 2025-11-22 08:11:02.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:03 np0005531888 nova_compute[186788]: 2025-11-22 08:11:03.756 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:04 np0005531888 nova_compute[186788]: 2025-11-22 08:11:04.857 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799049.8559794, 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:04 np0005531888 nova_compute[186788]: 2025-11-22 08:11:04.857 186792 INFO nova.compute.manager [-] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:11:04 np0005531888 nova_compute[186788]: 2025-11-22 08:11:04.889 186792 DEBUG nova.compute.manager [None req-7ee2ff78-6c3e-4aee-aa86-425660b7a474 - - - - - -] [instance: 7afef8c7-aef7-4c8a-a7a5-452a8d1283ee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:04 np0005531888 nova_compute[186788]: 2025-11-22 08:11:04.889 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:05 np0005531888 nova_compute[186788]: 2025-11-22 08:11:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:07 np0005531888 nova_compute[186788]: 2025-11-22 08:11:07.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:07 np0005531888 nova_compute[186788]: 2025-11-22 08:11:07.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:07 np0005531888 nova_compute[186788]: 2025-11-22 08:11:07.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:07 np0005531888 nova_compute[186788]: 2025-11-22 08:11:07.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:07 np0005531888 nova_compute[186788]: 2025-11-22 08:11:07.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.178 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.179 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5724MB free_disk=73.27410507202148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.180 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.180 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.275 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.275 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.299 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.324 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.359 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.360 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:08 np0005531888 nova_compute[186788]: 2025-11-22 08:11:08.758 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:09 np0005531888 nova_compute[186788]: 2025-11-22 08:11:09.360 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:09 np0005531888 nova_compute[186788]: 2025-11-22 08:11:09.361 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:11:09 np0005531888 nova_compute[186788]: 2025-11-22 08:11:09.891 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:10 np0005531888 nova_compute[186788]: 2025-11-22 08:11:10.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:12 np0005531888 podman[235011]: 2025-11-22 08:11:12.67978914 +0000 UTC m=+0.046032502 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:11:12 np0005531888 podman[235012]: 2025-11-22 08:11:12.686471365 +0000 UTC m=+0.048122074 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 03:11:13 np0005531888 nova_compute[186788]: 2025-11-22 08:11:13.760 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:13 np0005531888 nova_compute[186788]: 2025-11-22 08:11:13.969 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:13 np0005531888 nova_compute[186788]: 2025-11-22 08:11:13.970 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.001 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.093 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.093 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.100 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.100 186792 INFO nova.compute.claims [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.215 186792 DEBUG nova.compute.provider_tree [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.234 186792 DEBUG nova.scheduler.client.report [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.281 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.281 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.354 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.355 186792 DEBUG nova.network.neutron [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.372 186792 INFO nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.391 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.496 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.497 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.498 186792 INFO nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Creating image(s)#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.498 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "/var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.498 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "/var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.499 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "/var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.511 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.569 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.570 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.571 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.583 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.616 186792 DEBUG nova.policy [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7ac36fa0a934eec9a5ced482bdc3e78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2490428c0ca1403591486dc168517841', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.639 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.640 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.793 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk 1073741824" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.795 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.796 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.857 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.858 186792 DEBUG nova.virt.disk.api [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Checking if we can resize image /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.858 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.893 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.925 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.926 186792 DEBUG nova.virt.disk.api [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Cannot resize image /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.926 186792 DEBUG nova.objects.instance [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lazy-loading 'migration_context' on Instance uuid 3fcd21fa-f23a-474f-8980-8dbcbded9238 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.940 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.941 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Ensure instance console log exists: /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.942 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.942 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:14 np0005531888 nova_compute[186788]: 2025-11-22 08:11:14.942 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:15.624 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:2c:69 2001:db8:0:1:f816:3eff:fe30:2c69 2001:db8::f816:3eff:fe30:2c69'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe30:2c69/64 2001:db8::f816:3eff:fe30:2c69/64', 'neutron:device_id': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f86e6fc7-3969-4922-9612-9c86d85f21ec) old=Port_Binding(mac=['fa:16:3e:30:2c:69 2001:db8::f816:3eff:fe30:2c69'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe30:2c69/64', 'neutron:device_id': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:15.625 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f86e6fc7-3969-4922-9612-9c86d85f21ec in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 updated#033[00m
Nov 22 03:11:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:15.627 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:11:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:15.628 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cd0ab0-596e-409c-bb44-aa558aa6910d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:15 np0005531888 nova_compute[186788]: 2025-11-22 08:11:15.698 186792 DEBUG nova.network.neutron [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Successfully created port: eebcc821-cea3-4304-b856-eb388ff10624 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.391 186792 DEBUG nova.network.neutron [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Successfully updated port: eebcc821-cea3-4304-b856-eb388ff10624 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.406 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "refresh_cache-3fcd21fa-f23a-474f-8980-8dbcbded9238" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.407 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquired lock "refresh_cache-3fcd21fa-f23a-474f-8980-8dbcbded9238" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.407 186792 DEBUG nova.network.neutron [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.524 186792 DEBUG nova.compute.manager [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-changed-eebcc821-cea3-4304-b856-eb388ff10624 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.524 186792 DEBUG nova.compute.manager [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Refreshing instance network info cache due to event network-changed-eebcc821-cea3-4304-b856-eb388ff10624. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.525 186792 DEBUG oslo_concurrency.lockutils [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3fcd21fa-f23a-474f-8980-8dbcbded9238" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.621 186792 DEBUG nova.network.neutron [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:11:17 np0005531888 nova_compute[186788]: 2025-11-22 08:11:17.826 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:17.826 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:17.828 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.692 186792 DEBUG nova.network.neutron [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Updating instance_info_cache with network_info: [{"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.715 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Releasing lock "refresh_cache-3fcd21fa-f23a-474f-8980-8dbcbded9238" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.716 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Instance network_info: |[{"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.716 186792 DEBUG oslo_concurrency.lockutils [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3fcd21fa-f23a-474f-8980-8dbcbded9238" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.716 186792 DEBUG nova.network.neutron [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Refreshing network info cache for port eebcc821-cea3-4304-b856-eb388ff10624 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.719 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Start _get_guest_xml network_info=[{"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.724 186792 WARNING nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.733 186792 DEBUG nova.virt.libvirt.host [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.734 186792 DEBUG nova.virt.libvirt.host [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.737 186792 DEBUG nova.virt.libvirt.host [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.738 186792 DEBUG nova.virt.libvirt.host [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.741 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.741 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.742 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.742 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.742 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.743 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.743 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.743 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.743 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.744 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.744 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.744 186792 DEBUG nova.virt.hardware [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.748 186792 DEBUG nova.virt.libvirt.vif [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1041928057',display_name='tempest-ServerAddressesNegativeTestJSON-server-1041928057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1041928057',id=130,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2490428c0ca1403591486dc168517841',ramdisk_id='',reservation_id='r-tt3mn4gt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-129173326',owne
r_user_name='tempest-ServerAddressesNegativeTestJSON-129173326-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:14Z,user_data=None,user_id='c7ac36fa0a934eec9a5ced482bdc3e78',uuid=3fcd21fa-f23a-474f-8980-8dbcbded9238,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.748 186792 DEBUG nova.network.os_vif_util [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Converting VIF {"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.749 186792 DEBUG nova.network.os_vif_util [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.750 186792 DEBUG nova.objects.instance [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3fcd21fa-f23a-474f-8980-8dbcbded9238 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.761 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.765 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <uuid>3fcd21fa-f23a-474f-8980-8dbcbded9238</uuid>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <name>instance-00000082</name>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1041928057</nova:name>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:11:18</nova:creationTime>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:user uuid="c7ac36fa0a934eec9a5ced482bdc3e78">tempest-ServerAddressesNegativeTestJSON-129173326-project-member</nova:user>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:project uuid="2490428c0ca1403591486dc168517841">tempest-ServerAddressesNegativeTestJSON-129173326</nova:project>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        <nova:port uuid="eebcc821-cea3-4304-b856-eb388ff10624">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <entry name="serial">3fcd21fa-f23a-474f-8980-8dbcbded9238</entry>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <entry name="uuid">3fcd21fa-f23a-474f-8980-8dbcbded9238</entry>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.config"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:da:c2:fd"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <target dev="tapeebcc821-ce"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/console.log" append="off"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:11:18 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:11:18 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:11:18 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:11:18 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.765 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Preparing to wait for external event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.766 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.766 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.766 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.767 186792 DEBUG nova.virt.libvirt.vif [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1041928057',display_name='tempest-ServerAddressesNegativeTestJSON-server-1041928057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1041928057',id=130,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2490428c0ca1403591486dc168517841',ramdisk_id='',reservation_id='r-tt3mn4gt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-129173326',owner_user_name='tempest-ServerAddressesNegativeTestJSON-129173326-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:14Z,user_data=None,user_id='c7ac36fa0a934eec9a5ced482bdc3e78',uuid=3fcd21fa-f23a-474f-8980-8dbcbded9238,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.767 186792 DEBUG nova.network.os_vif_util [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Converting VIF {"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.768 186792 DEBUG nova.network.os_vif_util [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.768 186792 DEBUG os_vif [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.769 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.769 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.770 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.772 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.772 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeebcc821-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.773 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeebcc821-ce, col_values=(('external_ids', {'iface-id': 'eebcc821-cea3-4304-b856-eb388ff10624', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:c2:fd', 'vm-uuid': '3fcd21fa-f23a-474f-8980-8dbcbded9238'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.774 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:18 np0005531888 NetworkManager[55166]: <info>  [1763799078.7753] manager: (tapeebcc821-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.776 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.781 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.782 186792 INFO os_vif [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce')#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.838 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.839 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.839 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] No VIF found with MAC fa:16:3e:da:c2:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:11:18 np0005531888 nova_compute[186788]: 2025-11-22 08:11:18.840 186792 INFO nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Using config drive#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.563 186792 INFO nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Creating config drive at /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.config#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.569 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpemew5k1s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.699 186792 DEBUG oslo_concurrency.processutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpemew5k1s" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:19 np0005531888 kernel: tapeebcc821-ce: entered promiscuous mode
Nov 22 03:11:19 np0005531888 NetworkManager[55166]: <info>  [1763799079.7779] manager: (tapeebcc821-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Nov 22 03:11:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:19Z|00500|binding|INFO|Claiming lport eebcc821-cea3-4304-b856-eb388ff10624 for this chassis.
Nov 22 03:11:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:19Z|00501|binding|INFO|eebcc821-cea3-4304-b856-eb388ff10624: Claiming fa:16:3e:da:c2:fd 10.100.0.6
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.783 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.787 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.789 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531888 systemd-machined[153106]: New machine qemu-63-instance-00000082.
Nov 22 03:11:19 np0005531888 systemd-udevd[235085]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:11:19 np0005531888 NetworkManager[55166]: <info>  [1763799079.8239] device (tapeebcc821-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:11:19 np0005531888 NetworkManager[55166]: <info>  [1763799079.8249] device (tapeebcc821-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:11:19 np0005531888 systemd[1]: Started Virtual Machine qemu-63-instance-00000082.
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.840 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:c2:fd 10.100.0.6'], port_security=['fa:16:3e:da:c2:fd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3fcd21fa-f23a-474f-8980-8dbcbded9238', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-992fd996-37d5-4773-a99d-dcf5c136f735', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2490428c0ca1403591486dc168517841', 'neutron:revision_number': '2', 'neutron:security_group_ids': '56841950-dcc3-44e3-806c-87ee36227df9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b267ea6-827e-4dcd-9237-5e03570a02a9, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=eebcc821-cea3-4304-b856-eb388ff10624) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.842 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.843 104023 INFO neutron.agent.ovn.metadata.agent [-] Port eebcc821-cea3-4304-b856-eb388ff10624 in datapath 992fd996-37d5-4773-a99d-dcf5c136f735 bound to our chassis#033[00m
Nov 22 03:11:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:19Z|00502|binding|INFO|Setting lport eebcc821-cea3-4304-b856-eb388ff10624 ovn-installed in OVS
Nov 22 03:11:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:19Z|00503|binding|INFO|Setting lport eebcc821-cea3-4304-b856-eb388ff10624 up in Southbound
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.845 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 992fd996-37d5-4773-a99d-dcf5c136f735#033[00m
Nov 22 03:11:19 np0005531888 nova_compute[186788]: 2025-11-22 08:11:19.846 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.855 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e163529f-19e8-4dfb-afcd-8045526fef1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.856 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap992fd996-31 in ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.857 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap992fd996-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.858 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[de289468-7c43-4246-bb3a-f6770ed34970]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.859 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2213d489-d01b-4bea-991b-e1d6b3c14468]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.870 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b37808-cd09-4324-939f-8a75536de8fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.887 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d468b895-d414-4519-8a73-edc05bcae396]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.918 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[656792ca-631d-4fa4-8f2a-bdd45946bcc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 NetworkManager[55166]: <info>  [1763799079.9258] manager: (tap992fd996-30): new Veth device (/org/freedesktop/NetworkManager/Devices/238)
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.925 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[744ed9fa-20b4-4a9d-83cc-d1e4bc5295df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.958 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f669bec2-ee1c-4e65-9234-227425399e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.961 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[52c48d81-ce3b-4f4b-ac1f-b37f4d51a7ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:19 np0005531888 NetworkManager[55166]: <info>  [1763799079.9881] device (tap992fd996-30): carrier: link connected
Nov 22 03:11:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:19.994 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9de5c454-58a0-4087-8db8-1764ce15cc65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.009 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9409e8-f131-4e31-ae8a-d1d8338982e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap992fd996-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:4b:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582063, 'reachable_time': 31409, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235118, 'error': None, 'target': 'ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.022 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7e5019-4080-472c-9295-3213df412ca8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe39:4b7a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 582063, 'tstamp': 582063}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235119, 'error': None, 'target': 'ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.038 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[776768ed-ec71-4402-ad69-a611621627e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap992fd996-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:4b:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582063, 'reachable_time': 31409, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235120, 'error': None, 'target': 'ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.072 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[905805f7-1f70-4d1a-8c74-6deab7309b1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.123 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[386e9a46-21fd-4fb1-8854-b5df26ef79d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.125 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap992fd996-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.125 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.125 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap992fd996-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:20 np0005531888 kernel: tap992fd996-30: entered promiscuous mode
Nov 22 03:11:20 np0005531888 NetworkManager[55166]: <info>  [1763799080.1287] manager: (tap992fd996-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.129 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.130 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap992fd996-30, col_values=(('external_ids', {'iface-id': 'fcc126e0-c36b-4757-b482-4e4193296f66'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.131 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:20Z|00504|binding|INFO|Releasing lport fcc126e0-c36b-4757-b482-4e4193296f66 from this chassis (sb_readonly=0)
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.145 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.146 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/992fd996-37d5-4773-a99d-dcf5c136f735.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/992fd996-37d5-4773-a99d-dcf5c136f735.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.147 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[58521b35-b574-453b-b5fd-e8bea43de75a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.147 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-992fd996-37d5-4773-a99d-dcf5c136f735
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/992fd996-37d5-4773-a99d-dcf5c136f735.pid.haproxy
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 992fd996-37d5-4773-a99d-dcf5c136f735
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:11:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:20.148 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735', 'env', 'PROCESS_TAG=haproxy-992fd996-37d5-4773-a99d-dcf5c136f735', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/992fd996-37d5-4773-a99d-dcf5c136f735.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.210 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799080.2101452, 3fcd21fa-f23a-474f-8980-8dbcbded9238 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.211 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] VM Started (Lifecycle Event)#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.228 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.232 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799080.2103748, 3fcd21fa-f23a-474f-8980-8dbcbded9238 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.232 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.250 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.254 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.270 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:11:20 np0005531888 podman[235159]: 2025-11-22 08:11:20.520948161 +0000 UTC m=+0.051216601 container create f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.546 186792 DEBUG nova.network.neutron [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Updated VIF entry in instance network info cache for port eebcc821-cea3-4304-b856-eb388ff10624. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.546 186792 DEBUG nova.network.neutron [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Updating instance_info_cache with network_info: [{"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:20 np0005531888 systemd[1]: Started libpod-conmon-f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e.scope.
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.559 186792 DEBUG oslo_concurrency.lockutils [req-1e89c50b-3c6a-49c5-8315-8bf7d7235fff req-d8f23039-ebc5-4417-a201-a5239d1bc7d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3fcd21fa-f23a-474f-8980-8dbcbded9238" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:20 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:11:20 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260d714a3cf522b612b07116eb2fa2daa57a6c3b873500bb5e885efc5ce9bffe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:11:20 np0005531888 podman[235159]: 2025-11-22 08:11:20.493021584 +0000 UTC m=+0.023290044 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:11:20 np0005531888 podman[235159]: 2025-11-22 08:11:20.595702899 +0000 UTC m=+0.125971339 container init f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:11:20 np0005531888 podman[235159]: 2025-11-22 08:11:20.601480011 +0000 UTC m=+0.131748451 container start f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:11:20 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [NOTICE]   (235178) : New worker (235180) forked
Nov 22 03:11:20 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [NOTICE]   (235178) : Loading success.
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.656 186792 DEBUG nova.compute.manager [req-60c91898-0608-4b49-8286-e2c23165caf9 req-72da0cf7-2d0e-4128-8525-37ab4ecc7938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.657 186792 DEBUG oslo_concurrency.lockutils [req-60c91898-0608-4b49-8286-e2c23165caf9 req-72da0cf7-2d0e-4128-8525-37ab4ecc7938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.657 186792 DEBUG oslo_concurrency.lockutils [req-60c91898-0608-4b49-8286-e2c23165caf9 req-72da0cf7-2d0e-4128-8525-37ab4ecc7938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.657 186792 DEBUG oslo_concurrency.lockutils [req-60c91898-0608-4b49-8286-e2c23165caf9 req-72da0cf7-2d0e-4128-8525-37ab4ecc7938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.657 186792 DEBUG nova.compute.manager [req-60c91898-0608-4b49-8286-e2c23165caf9 req-72da0cf7-2d0e-4128-8525-37ab4ecc7938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Processing event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.658 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.662 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799080.6623847, 3fcd21fa-f23a-474f-8980-8dbcbded9238 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.663 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.664 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.667 186792 INFO nova.virt.libvirt.driver [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Instance spawned successfully.#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.668 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.683 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.688 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.691 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.691 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.692 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.692 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.693 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.693 186792 DEBUG nova.virt.libvirt.driver [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.722 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.857 186792 INFO nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Took 6.36 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:11:20 np0005531888 nova_compute[186788]: 2025-11-22 08:11:20.857 186792 DEBUG nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:21 np0005531888 nova_compute[186788]: 2025-11-22 08:11:21.122 186792 INFO nova.compute.manager [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Took 7.07 seconds to build instance.#033[00m
Nov 22 03:11:21 np0005531888 nova_compute[186788]: 2025-11-22 08:11:21.175 186792 DEBUG oslo_concurrency.lockutils [None req-971fc3ed-68e7-4bc0-8688-befbef7b3a2b c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.516 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.517 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.517 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.517 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.518 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.525 186792 INFO nova.compute.manager [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Terminating instance#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.533 186792 DEBUG nova.compute.manager [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:11:22 np0005531888 kernel: tapeebcc821-ce (unregistering): left promiscuous mode
Nov 22 03:11:22 np0005531888 NetworkManager[55166]: <info>  [1763799082.5528] device (tapeebcc821-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.558 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:22Z|00505|binding|INFO|Releasing lport eebcc821-cea3-4304-b856-eb388ff10624 from this chassis (sb_readonly=0)
Nov 22 03:11:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:22Z|00506|binding|INFO|Setting lport eebcc821-cea3-4304-b856-eb388ff10624 down in Southbound
Nov 22 03:11:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:22Z|00507|binding|INFO|Removing iface tapeebcc821-ce ovn-installed in OVS
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.560 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.570 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:c2:fd 10.100.0.6'], port_security=['fa:16:3e:da:c2:fd 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3fcd21fa-f23a-474f-8980-8dbcbded9238', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-992fd996-37d5-4773-a99d-dcf5c136f735', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2490428c0ca1403591486dc168517841', 'neutron:revision_number': '4', 'neutron:security_group_ids': '56841950-dcc3-44e3-806c-87ee36227df9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b267ea6-827e-4dcd-9237-5e03570a02a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=eebcc821-cea3-4304-b856-eb388ff10624) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.572 104023 INFO neutron.agent.ovn.metadata.agent [-] Port eebcc821-cea3-4304-b856-eb388ff10624 in datapath 992fd996-37d5-4773-a99d-dcf5c136f735 unbound from our chassis#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.575 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 992fd996-37d5-4773-a99d-dcf5c136f735, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.576 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3c5e2f-72aa-411e-8deb-c2d53b71bc80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.576 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735 namespace which is not needed anymore#033[00m
Nov 22 03:11:22 np0005531888 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000082.scope: Deactivated successfully.
Nov 22 03:11:22 np0005531888 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000082.scope: Consumed 2.236s CPU time.
Nov 22 03:11:22 np0005531888 systemd-machined[153106]: Machine qemu-63-instance-00000082 terminated.
Nov 22 03:11:22 np0005531888 podman[235189]: 2025-11-22 08:11:22.64143554 +0000 UTC m=+0.055408083 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:11:22 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [NOTICE]   (235178) : haproxy version is 2.8.14-c23fe91
Nov 22 03:11:22 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [NOTICE]   (235178) : path to executable is /usr/sbin/haproxy
Nov 22 03:11:22 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [WARNING]  (235178) : Exiting Master process...
Nov 22 03:11:22 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [ALERT]    (235178) : Current worker (235180) exited with code 143 (Terminated)
Nov 22 03:11:22 np0005531888 neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735[235174]: [WARNING]  (235178) : All workers exited. Exiting... (0)
Nov 22 03:11:22 np0005531888 systemd[1]: libpod-f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e.scope: Deactivated successfully.
Nov 22 03:11:22 np0005531888 podman[235228]: 2025-11-22 08:11:22.699931049 +0000 UTC m=+0.043318037 container died f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:11:22 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e-userdata-shm.mount: Deactivated successfully.
Nov 22 03:11:22 np0005531888 systemd[1]: var-lib-containers-storage-overlay-260d714a3cf522b612b07116eb2fa2daa57a6c3b873500bb5e885efc5ce9bffe-merged.mount: Deactivated successfully.
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.733 186792 DEBUG nova.compute.manager [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.735 186792 DEBUG oslo_concurrency.lockutils [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.735 186792 DEBUG oslo_concurrency.lockutils [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.735 186792 DEBUG oslo_concurrency.lockutils [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.735 186792 DEBUG nova.compute.manager [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] No waiting events found dispatching network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.736 186792 WARNING nova.compute.manager [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received unexpected event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.736 186792 DEBUG nova.compute.manager [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-vif-unplugged-eebcc821-cea3-4304-b856-eb388ff10624 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.736 186792 DEBUG oslo_concurrency.lockutils [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.738 186792 DEBUG oslo_concurrency.lockutils [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.739 186792 DEBUG oslo_concurrency.lockutils [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:22 np0005531888 podman[235228]: 2025-11-22 08:11:22.739314127 +0000 UTC m=+0.082701115 container cleanup f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.739 186792 DEBUG nova.compute.manager [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] No waiting events found dispatching network-vif-unplugged-eebcc821-cea3-4304-b856-eb388ff10624 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.739 186792 DEBUG nova.compute.manager [req-2546d2db-c431-41bb-9dc3-ae3ad432f80b req-7c247d0a-1cfd-4239-8bc3-c1c9403b80b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-vif-unplugged-eebcc821-cea3-4304-b856-eb388ff10624 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:11:22 np0005531888 systemd[1]: libpod-conmon-f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e.scope: Deactivated successfully.
Nov 22 03:11:22 np0005531888 NetworkManager[55166]: <info>  [1763799082.7587] manager: (tapeebcc821-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Nov 22 03:11:22 np0005531888 podman[235257]: 2025-11-22 08:11:22.808363745 +0000 UTC m=+0.048448642 container remove f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.813 186792 INFO nova.virt.libvirt.driver [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Instance destroyed successfully.#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.815 186792 DEBUG nova.objects.instance [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lazy-loading 'resources' on Instance uuid 3fcd21fa-f23a-474f-8980-8dbcbded9238 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.815 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea054c7-7b2b-4c7e-9174-33059341b2e4]: (4, ('Sat Nov 22 08:11:22 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735 (f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e)\nf0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e\nSat Nov 22 08:11:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735 (f0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e)\nf0076d05135fe683756995eecc69b03456f173aaa131ddbdb8cf39d425700f7e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.817 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7d25561b-12d0-4de3-894f-0c6db620571e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.819 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap992fd996-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.820 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 kernel: tap992fd996-30: left promiscuous mode
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.840 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.842 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[af18c06c-4aba-4882-b221-1e1e905ccda0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.857 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[05f4a645-e800-4526-933a-92f97a8880ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.860 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7015aa5b-7f67-4a0a-aeb6-e99eb743dd94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.879 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6e5327-b69b-460a-bc86-d6d9f56011d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582055, 'reachable_time': 43410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235294, 'error': None, 'target': 'ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.882 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-992fd996-37d5-4773-a99d-dcf5c136f735 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:11:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:22.883 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[c88f563e-7daf-4049-bf06-3c7f7187acb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:22 np0005531888 systemd[1]: run-netns-ovnmeta\x2d992fd996\x2d37d5\x2d4773\x2da99d\x2ddcf5c136f735.mount: Deactivated successfully.
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.931 186792 DEBUG nova.virt.libvirt.vif [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:11:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1041928057',display_name='tempest-ServerAddressesNegativeTestJSON-server-1041928057',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1041928057',id=130,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2490428c0ca1403591486dc168517841',ramdisk_id='',reservation_id='r-tt3mn4gt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-129173326',owner_user_name='tempest-ServerAddressesNegativeTestJSON-129173326-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:11:21Z,user_data=None,user_id='c7ac36fa0a934eec9a5ced482bdc3e78',uuid=3fcd21fa-f23a-474f-8980-8dbcbded9238,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.932 186792 DEBUG nova.network.os_vif_util [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Converting VIF {"id": "eebcc821-cea3-4304-b856-eb388ff10624", "address": "fa:16:3e:da:c2:fd", "network": {"id": "992fd996-37d5-4773-a99d-dcf5c136f735", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-198467550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2490428c0ca1403591486dc168517841", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeebcc821-ce", "ovs_interfaceid": "eebcc821-cea3-4304-b856-eb388ff10624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.932 186792 DEBUG nova.network.os_vif_util [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.933 186792 DEBUG os_vif [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.934 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.934 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeebcc821-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.938 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.940 186792 INFO os_vif [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:c2:fd,bridge_name='br-int',has_traffic_filtering=True,id=eebcc821-cea3-4304-b856-eb388ff10624,network=Network(992fd996-37d5-4773-a99d-dcf5c136f735),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeebcc821-ce')#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.941 186792 INFO nova.virt.libvirt.driver [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Deleting instance files /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238_del#033[00m
Nov 22 03:11:22 np0005531888 nova_compute[186788]: 2025-11-22 08:11:22.942 186792 INFO nova.virt.libvirt.driver [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Deletion of /var/lib/nova/instances/3fcd21fa-f23a-474f-8980-8dbcbded9238_del complete#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.005 186792 INFO nova.compute.manager [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.005 186792 DEBUG oslo.service.loopingcall [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.006 186792 DEBUG nova.compute.manager [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.006 186792 DEBUG nova.network.neutron [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.760 186792 DEBUG nova.network.neutron [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.763 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.779 186792 INFO nova.compute.manager [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Took 0.77 seconds to deallocate network for instance.#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.868 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.868 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.884 186792 DEBUG nova.compute.manager [req-89786a6f-2769-494b-94ac-40e4b2d47b30 req-3f4d16fc-bd6f-4173-9e1b-6ff300110370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-vif-deleted-eebcc821-cea3-4304-b856-eb388ff10624 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.938 186792 DEBUG nova.compute.provider_tree [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.953 186792 DEBUG nova.scheduler.client.report [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:11:23 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.972 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:24 np0005531888 nova_compute[186788]: 2025-11-22 08:11:23.999 186792 INFO nova.scheduler.client.report [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Deleted allocations for instance 3fcd21fa-f23a-474f-8980-8dbcbded9238#033[00m
Nov 22 03:11:24 np0005531888 nova_compute[186788]: 2025-11-22 08:11:24.065 186792 DEBUG oslo_concurrency.lockutils [None req-ab33b9ea-a86f-46f7-9ab7-af163bdc3110 c7ac36fa0a934eec9a5ced482bdc3e78 2490428c0ca1403591486dc168517841 - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:24 np0005531888 podman[235295]: 2025-11-22 08:11:24.682691061 +0000 UTC m=+0.055611149 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:11:25 np0005531888 nova_compute[186788]: 2025-11-22 08:11:25.057 186792 DEBUG nova.compute.manager [req-394038bf-c7c0-42f5-ae22-d7acb06d6e74 req-988300f4-ee05-4d3b-8cbe-f9d7a6eb76e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:25 np0005531888 nova_compute[186788]: 2025-11-22 08:11:25.057 186792 DEBUG oslo_concurrency.lockutils [req-394038bf-c7c0-42f5-ae22-d7acb06d6e74 req-988300f4-ee05-4d3b-8cbe-f9d7a6eb76e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:25 np0005531888 nova_compute[186788]: 2025-11-22 08:11:25.057 186792 DEBUG oslo_concurrency.lockutils [req-394038bf-c7c0-42f5-ae22-d7acb06d6e74 req-988300f4-ee05-4d3b-8cbe-f9d7a6eb76e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:25 np0005531888 nova_compute[186788]: 2025-11-22 08:11:25.057 186792 DEBUG oslo_concurrency.lockutils [req-394038bf-c7c0-42f5-ae22-d7acb06d6e74 req-988300f4-ee05-4d3b-8cbe-f9d7a6eb76e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3fcd21fa-f23a-474f-8980-8dbcbded9238-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:25 np0005531888 nova_compute[186788]: 2025-11-22 08:11:25.057 186792 DEBUG nova.compute.manager [req-394038bf-c7c0-42f5-ae22-d7acb06d6e74 req-988300f4-ee05-4d3b-8cbe-f9d7a6eb76e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] No waiting events found dispatching network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:11:25 np0005531888 nova_compute[186788]: 2025-11-22 08:11:25.058 186792 WARNING nova.compute.manager [req-394038bf-c7c0-42f5-ae22-d7acb06d6e74 req-988300f4-ee05-4d3b-8cbe-f9d7a6eb76e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Received unexpected event network-vif-plugged-eebcc821-cea3-4304-b856-eb388ff10624 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:11:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:26.830 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:27 np0005531888 nova_compute[186788]: 2025-11-22 08:11:27.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:28 np0005531888 podman[235319]: 2025-11-22 08:11:28.671246222 +0000 UTC m=+0.046627378 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:11:28 np0005531888 nova_compute[186788]: 2025-11-22 08:11:28.765 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:29 np0005531888 nova_compute[186788]: 2025-11-22 08:11:29.398 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:32 np0005531888 podman[235340]: 2025-11-22 08:11:32.704260126 +0000 UTC m=+0.081516995 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:11:32 np0005531888 podman[235341]: 2025-11-22 08:11:32.733433274 +0000 UTC m=+0.106077180 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 03:11:32 np0005531888 nova_compute[186788]: 2025-11-22 08:11:32.938 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:33 np0005531888 nova_compute[186788]: 2025-11-22 08:11:33.767 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:36.826 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:36.827 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:36.827 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:37 np0005531888 nova_compute[186788]: 2025-11-22 08:11:37.811 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799082.8090692, 3fcd21fa-f23a-474f-8980-8dbcbded9238 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:37 np0005531888 nova_compute[186788]: 2025-11-22 08:11:37.811 186792 INFO nova.compute.manager [-] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:11:37 np0005531888 nova_compute[186788]: 2025-11-22 08:11:37.858 186792 DEBUG nova.compute.manager [None req-c2336617-5780-4f54-b9f7-b72987d64819 - - - - - -] [instance: 3fcd21fa-f23a-474f-8980-8dbcbded9238] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:37 np0005531888 nova_compute[186788]: 2025-11-22 08:11:37.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:38 np0005531888 nova_compute[186788]: 2025-11-22 08:11:38.769 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:42 np0005531888 nova_compute[186788]: 2025-11-22 08:11:42.943 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:43 np0005531888 podman[235383]: 2025-11-22 08:11:43.681476612 +0000 UTC m=+0.051113728 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:11:43 np0005531888 podman[235382]: 2025-11-22 08:11:43.701584557 +0000 UTC m=+0.075060987 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:11:43 np0005531888 nova_compute[186788]: 2025-11-22 08:11:43.770 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:47 np0005531888 nova_compute[186788]: 2025-11-22 08:11:47.944 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:48 np0005531888 nova_compute[186788]: 2025-11-22 08:11:48.772 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.184 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.185 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.254 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.666 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.668 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.684 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.685 186792 INFO nova.compute.claims [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.820 186792 DEBUG nova.compute.provider_tree [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.844 186792 DEBUG nova.scheduler.client.report [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.875 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.923 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "45e2fd66-2565-490a-b65e-b3f6cce6c327" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.923 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "45e2fd66-2565-490a-b65e-b3f6cce6c327" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.933 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "45e2fd66-2565-490a-b65e-b3f6cce6c327" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.934 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.990 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:11:52 np0005531888 nova_compute[186788]: 2025-11-22 08:11:52.990 186792 DEBUG nova.network.neutron [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.007 186792 INFO nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.022 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.121 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.122 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.123 186792 INFO nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Creating image(s)#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.123 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "/var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.124 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "/var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.124 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "/var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.137 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.202 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.203 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.204 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.218 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.279 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.280 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.317 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.318 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.319 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.386 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.387 186792 DEBUG nova.virt.disk.api [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Checking if we can resize image /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.387 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.459 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.461 186792 DEBUG nova.virt.disk.api [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Cannot resize image /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.461 186792 DEBUG nova.objects.instance [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lazy-loading 'migration_context' on Instance uuid ecd0246b-9a9f-4d53-b53c-f97d54322a27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.475 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.475 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Ensure instance console log exists: /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.476 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.476 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.477 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.541 186792 DEBUG nova.policy [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd98783dea56411987b8da37b0a5ae70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e322c5cfadab4c3c9b407a0b64c130ec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:11:53 np0005531888 podman[235440]: 2025-11-22 08:11:53.670874702 +0000 UTC m=+0.047353545 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 03:11:53 np0005531888 nova_compute[186788]: 2025-11-22 08:11:53.773 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:54 np0005531888 nova_compute[186788]: 2025-11-22 08:11:54.502 186792 DEBUG nova.network.neutron [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Successfully created port: 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.536 186792 DEBUG nova.network.neutron [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Successfully updated port: 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.548 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "refresh_cache-ecd0246b-9a9f-4d53-b53c-f97d54322a27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.549 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquired lock "refresh_cache-ecd0246b-9a9f-4d53-b53c-f97d54322a27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.549 186792 DEBUG nova.network.neutron [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.638 186792 DEBUG nova.compute.manager [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-changed-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.639 186792 DEBUG nova.compute.manager [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Refreshing instance network info cache due to event network-changed-1c7dcfb7-98a2-49e9-8e53-69c63b34c801. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.639 186792 DEBUG oslo_concurrency.lockutils [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ecd0246b-9a9f-4d53-b53c-f97d54322a27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:55 np0005531888 podman[235460]: 2025-11-22 08:11:55.702163479 +0000 UTC m=+0.073118970 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:11:55 np0005531888 nova_compute[186788]: 2025-11-22 08:11:55.762 186792 DEBUG nova.network.neutron [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:11:56 np0005531888 nova_compute[186788]: 2025-11-22 08:11:56.976 186792 DEBUG nova.network.neutron [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Updating instance_info_cache with network_info: [{"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:56 np0005531888 nova_compute[186788]: 2025-11-22 08:11:56.995 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Releasing lock "refresh_cache-ecd0246b-9a9f-4d53-b53c-f97d54322a27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:56 np0005531888 nova_compute[186788]: 2025-11-22 08:11:56.995 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Instance network_info: |[{"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:11:56 np0005531888 nova_compute[186788]: 2025-11-22 08:11:56.996 186792 DEBUG oslo_concurrency.lockutils [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ecd0246b-9a9f-4d53-b53c-f97d54322a27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:56 np0005531888 nova_compute[186788]: 2025-11-22 08:11:56.996 186792 DEBUG nova.network.neutron [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Refreshing network info cache for port 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:56.999 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Start _get_guest_xml network_info=[{"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.006 186792 WARNING nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.015 186792 DEBUG nova.virt.libvirt.host [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.016 186792 DEBUG nova.virt.libvirt.host [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.020 186792 DEBUG nova.virt.libvirt.host [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.021 186792 DEBUG nova.virt.libvirt.host [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.022 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.022 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.023 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.023 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.023 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.023 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.024 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.024 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.024 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.024 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.025 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.025 186792 DEBUG nova.virt.hardware [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.029 186792 DEBUG nova.virt.libvirt.vif [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-281405040',display_name='tempest-ServerGroupTestJSON-server-281405040',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-281405040',id=132,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e322c5cfadab4c3c9b407a0b64c130ec',ramdisk_id='',reservation_id='r-gulkew7c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1542689531',owner_user_name='tempest-ServerGroupTestJSON-154268953
1-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:53Z,user_data=None,user_id='dd98783dea56411987b8da37b0a5ae70',uuid=ecd0246b-9a9f-4d53-b53c-f97d54322a27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.029 186792 DEBUG nova.network.os_vif_util [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Converting VIF {"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.030 186792 DEBUG nova.network.os_vif_util [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.031 186792 DEBUG nova.objects.instance [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lazy-loading 'pci_devices' on Instance uuid ecd0246b-9a9f-4d53-b53c-f97d54322a27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.043 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <uuid>ecd0246b-9a9f-4d53-b53c-f97d54322a27</uuid>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <name>instance-00000084</name>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerGroupTestJSON-server-281405040</nova:name>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:11:57</nova:creationTime>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:user uuid="dd98783dea56411987b8da37b0a5ae70">tempest-ServerGroupTestJSON-1542689531-project-member</nova:user>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:project uuid="e322c5cfadab4c3c9b407a0b64c130ec">tempest-ServerGroupTestJSON-1542689531</nova:project>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        <nova:port uuid="1c7dcfb7-98a2-49e9-8e53-69c63b34c801">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <entry name="serial">ecd0246b-9a9f-4d53-b53c-f97d54322a27</entry>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <entry name="uuid">ecd0246b-9a9f-4d53-b53c-f97d54322a27</entry>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.config"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:52:6b:ca"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <target dev="tap1c7dcfb7-98"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/console.log" append="off"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:11:57 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:11:57 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:11:57 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:11:57 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.044 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Preparing to wait for external event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.045 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.045 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.045 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.046 186792 DEBUG nova.virt.libvirt.vif [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-281405040',display_name='tempest-ServerGroupTestJSON-server-281405040',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-281405040',id=132,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e322c5cfadab4c3c9b407a0b64c130ec',ramdisk_id='',reservation_id='r-gulkew7c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1542689531',owner_user_name='tempest-ServerGroupTestJSON
-1542689531-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:53Z,user_data=None,user_id='dd98783dea56411987b8da37b0a5ae70',uuid=ecd0246b-9a9f-4d53-b53c-f97d54322a27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.046 186792 DEBUG nova.network.os_vif_util [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Converting VIF {"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.047 186792 DEBUG nova.network.os_vif_util [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.047 186792 DEBUG os_vif [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.048 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.049 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.052 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.053 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c7dcfb7-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.053 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c7dcfb7-98, col_values=(('external_ids', {'iface-id': '1c7dcfb7-98a2-49e9-8e53-69c63b34c801', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:6b:ca', 'vm-uuid': 'ecd0246b-9a9f-4d53-b53c-f97d54322a27'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.055 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:57 np0005531888 NetworkManager[55166]: <info>  [1763799117.0563] manager: (tap1c7dcfb7-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.058 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.061 186792 INFO os_vif [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98')#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.129 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.129 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.130 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] No VIF found with MAC fa:16:3e:52:6b:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.130 186792 INFO nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Using config drive#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.740 186792 INFO nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Creating config drive at /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.config#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.748 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfrt5lq8t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.876 186792 DEBUG oslo_concurrency.processutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfrt5lq8t" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:57 np0005531888 kernel: tap1c7dcfb7-98: entered promiscuous mode
Nov 22 03:11:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:57Z|00508|binding|INFO|Claiming lport 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 for this chassis.
Nov 22 03:11:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:57Z|00509|binding|INFO|1c7dcfb7-98a2-49e9-8e53-69c63b34c801: Claiming fa:16:3e:52:6b:ca 10.100.0.13
Nov 22 03:11:57 np0005531888 NetworkManager[55166]: <info>  [1763799117.9467] manager: (tap1c7dcfb7-98): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:57 np0005531888 nova_compute[186788]: 2025-11-22 08:11:57.950 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.960 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:6b:ca 10.100.0.13'], port_security=['fa:16:3e:52:6b:ca 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ecd0246b-9a9f-4d53-b53c-f97d54322a27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e322c5cfadab4c3c9b407a0b64c130ec', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cefb6f59-09a9-43c1-a5b7-b7181d49aeba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e5a0f3f-9168-4b8a-993f-e2abfb69d7af, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=1c7dcfb7-98a2-49e9-8e53-69c63b34c801) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.961 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 in datapath 1ce589d6-10ee-4975-970e-a6e77bdebee0 bound to our chassis#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.963 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1ce589d6-10ee-4975-970e-a6e77bdebee0#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.974 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5732a02f-ed38-478a-a7ab-c062780d5298]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.975 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1ce589d6-11 in ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:11:57 np0005531888 systemd-udevd[235504]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.977 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1ce589d6-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.977 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ba278d71-1a44-49e7-ac3c-5cf867c7481e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.978 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5e701d0f-17bf-481f-b46c-083d381faaaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:57 np0005531888 systemd-machined[153106]: New machine qemu-64-instance-00000084.
Nov 22 03:11:57 np0005531888 NetworkManager[55166]: <info>  [1763799117.9893] device (tap1c7dcfb7-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:11:57 np0005531888 NetworkManager[55166]: <info>  [1763799117.9901] device (tap1c7dcfb7-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:11:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:57.989 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[3851ca84-c77d-45bb-b628-a78881797fac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.005 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.007 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bf67cb56-3f2e-43be-937d-0e2223edb76d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 systemd[1]: Started Virtual Machine qemu-64-instance-00000084.
Nov 22 03:11:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:58Z|00510|binding|INFO|Setting lport 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 ovn-installed in OVS
Nov 22 03:11:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:58Z|00511|binding|INFO|Setting lport 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 up in Southbound
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.011 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.035 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d0d872-6c01-4960-8ad9-a0262e8d3815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 NetworkManager[55166]: <info>  [1763799118.0416] manager: (tap1ce589d6-10): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.041 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[082e9624-d05a-4053-b96f-3ebcdf4e9417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.073 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[460d5836-3f1c-431c-ba65-a25e00c785bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.077 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d2cd20-2eb9-461c-b32b-9a284f765395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 NetworkManager[55166]: <info>  [1763799118.0982] device (tap1ce589d6-10): carrier: link connected
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.104 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[52cbfbb0-51ed-4a11-846d-572beb725f2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.122 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b03896a1-2e01-458d-9389-22a745b954ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1ce589d6-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:43:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585874, 'reachable_time': 39906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235537, 'error': None, 'target': 'ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.138 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3c71c2-50b5-41c9-be00-1a3c0034034a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:432a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585874, 'tstamp': 585874}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235538, 'error': None, 'target': 'ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.157 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb4be24-a453-4e84-a297-c8a2b1a769b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1ce589d6-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:43:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585874, 'reachable_time': 39906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235539, 'error': None, 'target': 'ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.189 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3e195bef-a3fe-4d7c-b304-dfc011c65398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.240 186792 DEBUG nova.compute.manager [req-19973d42-b634-4362-8d8e-8e1d45489772 req-7db8e8ed-d753-46c2-988e-2ac291859493 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.240 186792 DEBUG oslo_concurrency.lockutils [req-19973d42-b634-4362-8d8e-8e1d45489772 req-7db8e8ed-d753-46c2-988e-2ac291859493 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.241 186792 DEBUG oslo_concurrency.lockutils [req-19973d42-b634-4362-8d8e-8e1d45489772 req-7db8e8ed-d753-46c2-988e-2ac291859493 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.241 186792 DEBUG oslo_concurrency.lockutils [req-19973d42-b634-4362-8d8e-8e1d45489772 req-7db8e8ed-d753-46c2-988e-2ac291859493 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.242 186792 DEBUG nova.compute.manager [req-19973d42-b634-4362-8d8e-8e1d45489772 req-7db8e8ed-d753-46c2-988e-2ac291859493 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Processing event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.258 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a143a363-5d08-4e39-82a2-09728c8920cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.260 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ce589d6-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.260 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.261 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ce589d6-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:58 np0005531888 NetworkManager[55166]: <info>  [1763799118.2640] manager: (tap1ce589d6-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Nov 22 03:11:58 np0005531888 kernel: tap1ce589d6-10: entered promiscuous mode
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.263 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.266 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.267 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1ce589d6-10, col_values=(('external_ids', {'iface-id': 'bb64c6f6-5f36-4806-9330-78d403274f85'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:11:58Z|00512|binding|INFO|Releasing lport bb64c6f6-5f36-4806-9330-78d403274f85 from this chassis (sb_readonly=0)
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.269 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.281 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.283 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1ce589d6-10ee-4975-970e-a6e77bdebee0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1ce589d6-10ee-4975-970e-a6e77bdebee0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.284 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1a475a1f-6850-4aab-bb5e-ed3f00fa1949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.286 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-1ce589d6-10ee-4975-970e-a6e77bdebee0
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/1ce589d6-10ee-4975-970e-a6e77bdebee0.pid.haproxy
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 1ce589d6-10ee-4975-970e-a6e77bdebee0
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:11:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:11:58.289 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'env', 'PROCESS_TAG=haproxy-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1ce589d6-10ee-4975-970e-a6e77bdebee0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:11:58 np0005531888 podman[235571]: 2025-11-22 08:11:58.670573591 +0000 UTC m=+0.064847616 container create 734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.697 186792 DEBUG nova.network.neutron [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Updated VIF entry in instance network info cache for port 1c7dcfb7-98a2-49e9-8e53-69c63b34c801. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.698 186792 DEBUG nova.network.neutron [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Updating instance_info_cache with network_info: [{"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:58 np0005531888 systemd[1]: Started libpod-conmon-734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37.scope.
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.714 186792 DEBUG oslo_concurrency.lockutils [req-78fc6191-75b2-42d4-845e-4765816c89b6 req-3b1e3336-6283-4f81-b0a0-0c04fcd8e742 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ecd0246b-9a9f-4d53-b53c-f97d54322a27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:58 np0005531888 podman[235571]: 2025-11-22 08:11:58.628466976 +0000 UTC m=+0.022741011 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:11:58 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:11:58 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df16eb23b05350d71830f28337b10ae13bb262e8ce6c41baaddbd7b593f0b6cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:11:58 np0005531888 podman[235571]: 2025-11-22 08:11:58.767190438 +0000 UTC m=+0.161464473 container init 734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:11:58 np0005531888 podman[235571]: 2025-11-22 08:11:58.775188544 +0000 UTC m=+0.169462559 container start 734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.777 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:58 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [NOTICE]   (235601) : New worker (235611) forked
Nov 22 03:11:58 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [NOTICE]   (235601) : Loading success.
Nov 22 03:11:58 np0005531888 podman[235589]: 2025-11-22 08:11:58.808335029 +0000 UTC m=+0.075390745 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, 
build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 22 03:11:58 np0005531888 nova_compute[186788]: 2025-11-22 08:11:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.782 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799119.7819586, ecd0246b-9a9f-4d53-b53c-f97d54322a27 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.782 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] VM Started (Lifecycle Event)#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.785 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.789 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.795 186792 INFO nova.virt.libvirt.driver [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Instance spawned successfully.#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.796 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.807 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.811 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.832 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.832 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799119.7830322, ecd0246b-9a9f-4d53-b53c-f97d54322a27 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.832 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.837 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.838 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.838 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.838 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.839 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.839 186792 DEBUG nova.virt.libvirt.driver [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.865 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.868 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799119.7881243, ecd0246b-9a9f-4d53-b53c-f97d54322a27 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.868 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.893 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.897 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.913 186792 INFO nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Took 6.79 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.913 186792 DEBUG nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.919 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:11:59 np0005531888 nova_compute[186788]: 2025-11-22 08:11:59.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.010 186792 INFO nova.compute.manager [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Took 7.41 seconds to build instance.#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.027 186792 DEBUG oslo_concurrency.lockutils [None req-6b7f3696-6e8d-4a75-802b-0e667cfb7b42 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.358 186792 DEBUG nova.compute.manager [req-91e9ee5b-6e05-4bb5-97d3-7f97197e1759 req-c4ff46c3-8e48-48d3-a6e0-75dc07477dc9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.359 186792 DEBUG oslo_concurrency.lockutils [req-91e9ee5b-6e05-4bb5-97d3-7f97197e1759 req-c4ff46c3-8e48-48d3-a6e0-75dc07477dc9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.359 186792 DEBUG oslo_concurrency.lockutils [req-91e9ee5b-6e05-4bb5-97d3-7f97197e1759 req-c4ff46c3-8e48-48d3-a6e0-75dc07477dc9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.359 186792 DEBUG oslo_concurrency.lockutils [req-91e9ee5b-6e05-4bb5-97d3-7f97197e1759 req-c4ff46c3-8e48-48d3-a6e0-75dc07477dc9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.359 186792 DEBUG nova.compute.manager [req-91e9ee5b-6e05-4bb5-97d3-7f97197e1759 req-c4ff46c3-8e48-48d3-a6e0-75dc07477dc9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] No waiting events found dispatching network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:12:00 np0005531888 nova_compute[186788]: 2025-11-22 08:12:00.360 186792 WARNING nova.compute.manager [req-91e9ee5b-6e05-4bb5-97d3-7f97197e1759 req-c4ff46c3-8e48-48d3-a6e0-75dc07477dc9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received unexpected event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.002 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.003 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.003 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.003 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.003 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.011 186792 INFO nova.compute.manager [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Terminating instance#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.017 186792 DEBUG nova.compute.manager [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:12:02 np0005531888 kernel: tap1c7dcfb7-98 (unregistering): left promiscuous mode
Nov 22 03:12:02 np0005531888 NetworkManager[55166]: <info>  [1763799122.0397] device (tap1c7dcfb7-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:12:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:02Z|00513|binding|INFO|Releasing lport 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 from this chassis (sb_readonly=0)
Nov 22 03:12:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:02Z|00514|binding|INFO|Setting lport 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 down in Southbound
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.052 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:02Z|00515|binding|INFO|Removing iface tap1c7dcfb7-98 ovn-installed in OVS
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:02.060 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:6b:ca 10.100.0.13'], port_security=['fa:16:3e:52:6b:ca 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ecd0246b-9a9f-4d53-b53c-f97d54322a27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e322c5cfadab4c3c9b407a0b64c130ec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cefb6f59-09a9-43c1-a5b7-b7181d49aeba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e5a0f3f-9168-4b8a-993f-e2abfb69d7af, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=1c7dcfb7-98a2-49e9-8e53-69c63b34c801) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:12:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:02.062 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 1c7dcfb7-98a2-49e9-8e53-69c63b34c801 in datapath 1ce589d6-10ee-4975-970e-a6e77bdebee0 unbound from our chassis#033[00m
Nov 22 03:12:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:02.063 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1ce589d6-10ee-4975-970e-a6e77bdebee0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:12:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:02.065 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[54131e4e-891e-4c14-90b7-d495f6a9a817]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:02 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:02.066 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0 namespace which is not needed anymore#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.070 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000084.scope: Deactivated successfully.
Nov 22 03:12:02 np0005531888 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000084.scope: Consumed 3.926s CPU time.
Nov 22 03:12:02 np0005531888 systemd-machined[153106]: Machine qemu-64-instance-00000084 terminated.
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.242 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.246 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.283 186792 INFO nova.virt.libvirt.driver [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Instance destroyed successfully.#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.284 186792 DEBUG nova.objects.instance [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lazy-loading 'resources' on Instance uuid ecd0246b-9a9f-4d53-b53c-f97d54322a27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.299 186792 DEBUG nova.virt.libvirt.vif [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:11:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-281405040',display_name='tempest-ServerGroupTestJSON-server-281405040',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-281405040',id=132,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e322c5cfadab4c3c9b407a0b64c130ec',ramdisk_id='',reservation_id='r-gulkew7c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-1542689531',owner_user_name='tempest-ServerGroupTestJSON-1542689531-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:11:59Z,user_data=None,user_id='dd98783dea56411987b8da37b0a5ae70',uuid=ecd0246b-9a9f-4d53-b53c-f97d54322a27,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.299 186792 DEBUG nova.network.os_vif_util [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Converting VIF {"id": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "address": "fa:16:3e:52:6b:ca", "network": {"id": "1ce589d6-10ee-4975-970e-a6e77bdebee0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-279841684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e322c5cfadab4c3c9b407a0b64c130ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c7dcfb7-98", "ovs_interfaceid": "1c7dcfb7-98a2-49e9-8e53-69c63b34c801", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.300 186792 DEBUG nova.network.os_vif_util [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.300 186792 DEBUG os_vif [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:12:02 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [NOTICE]   (235601) : haproxy version is 2.8.14-c23fe91
Nov 22 03:12:02 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [NOTICE]   (235601) : path to executable is /usr/sbin/haproxy
Nov 22 03:12:02 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [WARNING]  (235601) : Exiting Master process...
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.302 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c7dcfb7-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:02 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [ALERT]    (235601) : Current worker (235611) exited with code 143 (Terminated)
Nov 22 03:12:02 np0005531888 neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0[235587]: [WARNING]  (235601) : All workers exited. Exiting... (0)
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.305 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 systemd[1]: libpod-734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37.scope: Deactivated successfully.
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.308 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531888 podman[235650]: 2025-11-22 08:12:02.31298246 +0000 UTC m=+0.159131034 container died 734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.312 186792 INFO os_vif [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:6b:ca,bridge_name='br-int',has_traffic_filtering=True,id=1c7dcfb7-98a2-49e9-8e53-69c63b34c801,network=Network(1ce589d6-10ee-4975-970e-a6e77bdebee0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c7dcfb7-98')#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.314 186792 INFO nova.virt.libvirt.driver [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Deleting instance files /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27_del#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.315 186792 INFO nova.virt.libvirt.driver [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Deletion of /var/lib/nova/instances/ecd0246b-9a9f-4d53-b53c-f97d54322a27_del complete#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.388 186792 INFO nova.compute.manager [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.389 186792 DEBUG oslo.service.loopingcall [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.389 186792 DEBUG nova.compute.manager [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.389 186792 DEBUG nova.network.neutron [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.437 186792 DEBUG nova.compute.manager [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-vif-unplugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.438 186792 DEBUG oslo_concurrency.lockutils [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.438 186792 DEBUG oslo_concurrency.lockutils [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.438 186792 DEBUG oslo_concurrency.lockutils [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.438 186792 DEBUG nova.compute.manager [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] No waiting events found dispatching network-vif-unplugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.439 186792 DEBUG nova.compute.manager [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-vif-unplugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.439 186792 DEBUG nova.compute.manager [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.439 186792 DEBUG oslo_concurrency.lockutils [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.439 186792 DEBUG oslo_concurrency.lockutils [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.439 186792 DEBUG oslo_concurrency.lockutils [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.439 186792 DEBUG nova.compute.manager [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] No waiting events found dispatching network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.440 186792 WARNING nova.compute.manager [req-e5a9eba6-ba82-49b8-91d9-6796004a6318 req-88b5cc35-dc32-483d-887f-ca48fe52d12a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received unexpected event network-vif-plugged-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:12:02 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37-userdata-shm.mount: Deactivated successfully.
Nov 22 03:12:02 np0005531888 systemd[1]: var-lib-containers-storage-overlay-df16eb23b05350d71830f28337b10ae13bb262e8ce6c41baaddbd7b593f0b6cb-merged.mount: Deactivated successfully.
Nov 22 03:12:02 np0005531888 podman[235694]: 2025-11-22 08:12:02.851045932 +0000 UTC m=+0.062309043 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:12:02 np0005531888 podman[235695]: 2025-11-22 08:12:02.877342309 +0000 UTC m=+0.085614396 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:02 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.977 186792 DEBUG nova.network.neutron [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:02.999 186792 INFO nova.compute.manager [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Took 0.61 seconds to deallocate network for instance.#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.055 186792 DEBUG nova.compute.manager [req-fa960638-a891-4510-9341-574558881dde req-cc6fa8bc-8c2d-4c2f-9b46-7510f7081b48 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Received event network-vif-deleted-1c7dcfb7-98a2-49e9-8e53-69c63b34c801 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.060 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.061 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.107 186792 DEBUG nova.compute.provider_tree [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.120 186792 DEBUG nova.scheduler.client.report [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.143 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.164 186792 INFO nova.scheduler.client.report [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Deleted allocations for instance ecd0246b-9a9f-4d53-b53c-f97d54322a27#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.234 186792 DEBUG oslo_concurrency.lockutils [None req-f47b379c-4168-405c-a651-5e9afab406b8 dd98783dea56411987b8da37b0a5ae70 e322c5cfadab4c3c9b407a0b64c130ec - - default default] Lock "ecd0246b-9a9f-4d53-b53c-f97d54322a27" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:03 np0005531888 podman[235650]: 2025-11-22 08:12:03.334217216 +0000 UTC m=+1.180365760 container cleanup 734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:12:03 np0005531888 systemd[1]: libpod-conmon-734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37.scope: Deactivated successfully.
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.779 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:03 np0005531888 nova_compute[186788]: 2025-11-22 08:12:03.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:04 np0005531888 podman[235742]: 2025-11-22 08:12:04.015330497 +0000 UTC m=+0.662054274 container remove 734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.020 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2e60a00d-f8b3-4053-9ae9-76ce298043cd]: (4, ('Sat Nov 22 08:12:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0 (734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37)\n734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37\nSat Nov 22 08:12:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0 (734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37)\n734467ebfbefe628bf9de54cb134920e48cc77d47334a5830f08848fd0dc3e37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.022 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9c25aed5-0eef-4bc5-976b-2d51f1a2fab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.023 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ce589d6-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:04 np0005531888 nova_compute[186788]: 2025-11-22 08:12:04.024 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:04 np0005531888 kernel: tap1ce589d6-10: left promiscuous mode
Nov 22 03:12:04 np0005531888 nova_compute[186788]: 2025-11-22 08:12:04.036 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.042 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[00325fea-b3cd-48d6-8d59-2c0a0ddd3502]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.062 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[68c733e7-3ebb-4c6b-ba13-7e9291272323]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.063 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7aabb199-6b38-4965-96a3-6cd3e3ec5054]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.081 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[efcada46-3641-4e4f-92f3-9c23317ec650]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585867, 'reachable_time': 26072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235758, 'error': None, 'target': 'ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.084 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1ce589d6-10ee-4975-970e-a6e77bdebee0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:12:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:04.085 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[14d357ac-1e29-43bf-8648-c1512e54807c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:04 np0005531888 systemd[1]: run-netns-ovnmeta\x2d1ce589d6\x2d10ee\x2d4975\x2d970e\x2da6e77bdebee0.mount: Deactivated successfully.
Nov 22 03:12:05 np0005531888 nova_compute[186788]: 2025-11-22 08:12:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:07 np0005531888 nova_compute[186788]: 2025-11-22 08:12:07.305 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:07 np0005531888 nova_compute[186788]: 2025-11-22 08:12:07.518 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:08 np0005531888 nova_compute[186788]: 2025-11-22 08:12:08.780 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:09 np0005531888 nova_compute[186788]: 2025-11-22 08:12:09.975 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.161 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.163 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5682MB free_disk=73.27408981323242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.163 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.163 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.227 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.227 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.259 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.273 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.298 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:12:10 np0005531888 nova_compute[186788]: 2025-11-22 08:12:10.299 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:11 np0005531888 nova_compute[186788]: 2025-11-22 08:12:11.299 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:12 np0005531888 nova_compute[186788]: 2025-11-22 08:12:12.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:13 np0005531888 nova_compute[186788]: 2025-11-22 08:12:13.782 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:14 np0005531888 podman[235761]: 2025-11-22 08:12:14.681290037 +0000 UTC m=+0.047117811 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:12:14 np0005531888 podman[235760]: 2025-11-22 08:12:14.682420875 +0000 UTC m=+0.051918668 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:12:17 np0005531888 nova_compute[186788]: 2025-11-22 08:12:17.281 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799122.2805066, ecd0246b-9a9f-4d53-b53c-f97d54322a27 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:12:17 np0005531888 nova_compute[186788]: 2025-11-22 08:12:17.281 186792 INFO nova.compute.manager [-] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:12:17 np0005531888 nova_compute[186788]: 2025-11-22 08:12:17.308 186792 DEBUG nova.compute.manager [None req-606b1cd7-dd4b-4db9-9935-c82091d52e49 - - - - - -] [instance: ecd0246b-9a9f-4d53-b53c-f97d54322a27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:12:17 np0005531888 nova_compute[186788]: 2025-11-22 08:12:17.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:18 np0005531888 nova_compute[186788]: 2025-11-22 08:12:18.783 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:18.797 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:12:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:18.798 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:12:18 np0005531888 nova_compute[186788]: 2025-11-22 08:12:18.798 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.260 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.261 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.306 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.401 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.402 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.410 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.411 186792 INFO nova.compute.claims [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.541 186792 DEBUG nova.compute.provider_tree [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.558 186792 DEBUG nova.scheduler.client.report [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.581 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.582 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.647 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.648 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.664 186792 INFO nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.692 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.819 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.821 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.821 186792 INFO nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Creating image(s)#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.822 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.822 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.823 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.838 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.896 186792 DEBUG nova.policy [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.903 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.904 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.905 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.917 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.972 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:20 np0005531888 nova_compute[186788]: 2025-11-22 08:12:20.973 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.008 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.009 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.010 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.078 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.079 186792 DEBUG nova.virt.disk.api [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.080 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.141 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.142 186792 DEBUG nova.virt.disk.api [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.143 186792 DEBUG nova.objects.instance [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid e5342b10-43ea-4199-8bb3-0aa50f8ddd11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.154 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.155 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Ensure instance console log exists: /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.155 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.156 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.156 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:21 np0005531888 nova_compute[186788]: 2025-11-22 08:12:21.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:22 np0005531888 nova_compute[186788]: 2025-11-22 08:12:22.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:22 np0005531888 nova_compute[186788]: 2025-11-22 08:12:22.570 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Successfully created port: 77356784-a6b0-4c1e-9094-0881172d3c27 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:12:23 np0005531888 nova_compute[186788]: 2025-11-22 08:12:23.785 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:24 np0005531888 nova_compute[186788]: 2025-11-22 08:12:24.226 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Successfully created port: e94797a8-0eb2-455e-aebc-d72f3acfb7a6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:12:24 np0005531888 podman[235814]: 2025-11-22 08:12:24.678789807 +0000 UTC m=+0.053880796 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 22 03:12:25 np0005531888 nova_compute[186788]: 2025-11-22 08:12:25.743 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Successfully updated port: 77356784-a6b0-4c1e-9094-0881172d3c27 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:12:25 np0005531888 nova_compute[186788]: 2025-11-22 08:12:25.911 186792 DEBUG nova.compute.manager [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-changed-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:25 np0005531888 nova_compute[186788]: 2025-11-22 08:12:25.912 186792 DEBUG nova.compute.manager [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing instance network info cache due to event network-changed-77356784-a6b0-4c1e-9094-0881172d3c27. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:12:25 np0005531888 nova_compute[186788]: 2025-11-22 08:12:25.912 186792 DEBUG oslo_concurrency.lockutils [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:12:25 np0005531888 nova_compute[186788]: 2025-11-22 08:12:25.912 186792 DEBUG oslo_concurrency.lockutils [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:12:25 np0005531888 nova_compute[186788]: 2025-11-22 08:12:25.913 186792 DEBUG nova.network.neutron [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing network info cache for port 77356784-a6b0-4c1e-9094-0881172d3c27 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:12:26 np0005531888 nova_compute[186788]: 2025-11-22 08:12:26.567 186792 DEBUG nova.network.neutron [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:12:26 np0005531888 podman[235834]: 2025-11-22 08:12:26.668388447 +0000 UTC m=+0.046426652 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:12:26 np0005531888 nova_compute[186788]: 2025-11-22 08:12:26.969 186792 DEBUG nova.network.neutron [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:12:26 np0005531888 nova_compute[186788]: 2025-11-22 08:12:26.990 186792 DEBUG oslo_concurrency.lockutils [req-ca9524ab-77c2-40c9-b7ca-d3fd931beecd req-9798dec1-3a60-4164-8204-792323779304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:12:27 np0005531888 nova_compute[186788]: 2025-11-22 08:12:27.213 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Successfully updated port: e94797a8-0eb2-455e-aebc-d72f3acfb7a6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:12:27 np0005531888 nova_compute[186788]: 2025-11-22 08:12:27.239 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:12:27 np0005531888 nova_compute[186788]: 2025-11-22 08:12:27.239 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:12:27 np0005531888 nova_compute[186788]: 2025-11-22 08:12:27.239 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:12:27 np0005531888 nova_compute[186788]: 2025-11-22 08:12:27.312 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:27 np0005531888 nova_compute[186788]: 2025-11-22 08:12:27.433 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:12:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:27.801 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:28 np0005531888 nova_compute[186788]: 2025-11-22 08:12:28.000 186792 DEBUG nova.compute.manager [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-changed-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:28 np0005531888 nova_compute[186788]: 2025-11-22 08:12:28.000 186792 DEBUG nova.compute.manager [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing instance network info cache due to event network-changed-e94797a8-0eb2-455e-aebc-d72f3acfb7a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:12:28 np0005531888 nova_compute[186788]: 2025-11-22 08:12:28.001 186792 DEBUG oslo_concurrency.lockutils [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:12:28 np0005531888 nova_compute[186788]: 2025-11-22 08:12:28.788 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:29 np0005531888 podman[235857]: 2025-11-22 08:12:29.682079903 +0000 UTC m=+0.060206292 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal 
Base Image 9., config_id=edpm, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.717 186792 DEBUG nova.network.neutron [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.883 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.884 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Instance network_info: |[{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.884 186792 DEBUG oslo_concurrency.lockutils [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.885 186792 DEBUG nova.network.neutron [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing network info cache for port e94797a8-0eb2-455e-aebc-d72f3acfb7a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.888 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Start _get_guest_xml network_info=[{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.892 186792 WARNING nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.898 186792 DEBUG nova.virt.libvirt.host [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.898 186792 DEBUG nova.virt.libvirt.host [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.904 186792 DEBUG nova.virt.libvirt.host [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.904 186792 DEBUG nova.virt.libvirt.host [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.906 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.906 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.906 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.906 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.907 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.907 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.907 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.907 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.908 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.908 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.908 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.908 186792 DEBUG nova.virt.hardware [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.912 186792 DEBUG nova.virt.libvirt.vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1459174835',display_name='tempest-TestGettingAddress-server-1459174835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1459174835',id=133,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-mrw6wiad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:20Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=e5342b10-43ea-4199-8bb3-0aa50f8ddd11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.912 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.913 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.913 186792 DEBUG nova.virt.libvirt.vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1459174835',display_name='tempest-TestGettingAddress-server-1459174835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1459174835',id=133,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-mrw6wiad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:20Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=e5342b10-43ea-4199-8bb3-0aa50f8ddd11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.914 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.914 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.915 186792 DEBUG nova.objects.instance [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid e5342b10-43ea-4199-8bb3-0aa50f8ddd11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.927 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <uuid>e5342b10-43ea-4199-8bb3-0aa50f8ddd11</uuid>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <name>instance-00000085</name>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestGettingAddress-server-1459174835</nova:name>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:12:30</nova:creationTime>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:port uuid="77356784-a6b0-4c1e-9094-0881172d3c27">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        <nova:port uuid="e94797a8-0eb2-455e-aebc-d72f3acfb7a6">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe7a:147e" ipVersion="6"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe7a:147e" ipVersion="6"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <entry name="serial">e5342b10-43ea-4199-8bb3-0aa50f8ddd11</entry>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <entry name="uuid">e5342b10-43ea-4199-8bb3-0aa50f8ddd11</entry>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.config"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:f5:80:44"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <target dev="tap77356784-a6"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:7a:14:7e"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <target dev="tape94797a8-0e"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/console.log" append="off"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:12:30 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:12:30 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:12:30 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:12:30 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.928 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Preparing to wait for external event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.929 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.929 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.929 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.929 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Preparing to wait for external event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.930 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.930 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.930 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.930 186792 DEBUG nova.virt.libvirt.vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1459174835',display_name='tempest-TestGettingAddress-server-1459174835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1459174835',id=133,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-mrw6wiad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:20Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=e5342b10-43ea-4199-8bb3-0aa50f8ddd11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.931 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.931 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.932 186792 DEBUG os_vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.932 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.932 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.933 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.936 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77356784-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.936 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77356784-a6, col_values=(('external_ids', {'iface-id': '77356784-a6b0-4c1e-9094-0881172d3c27', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:80:44', 'vm-uuid': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.938 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:30 np0005531888 NetworkManager[55166]: <info>  [1763799150.9390] manager: (tap77356784-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.946 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.947 186792 INFO os_vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6')
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.948 186792 DEBUG nova.virt.libvirt.vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1459174835',display_name='tempest-TestGettingAddress-server-1459174835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1459174835',id=133,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-mrw6wiad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:20Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=e5342b10-43ea-4199-8bb3-0aa50f8ddd11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.948 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.949 186792 DEBUG nova.network.os_vif_util [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.949 186792 DEBUG os_vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.950 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.950 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.950 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.953 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.953 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape94797a8-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.954 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape94797a8-0e, col_values=(('external_ids', {'iface-id': 'e94797a8-0eb2-455e-aebc-d72f3acfb7a6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:14:7e', 'vm-uuid': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:30 np0005531888 NetworkManager[55166]: <info>  [1763799150.9562] manager: (tape94797a8-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.962 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:30 np0005531888 nova_compute[186788]: 2025-11-22 08:12:30.963 186792 INFO os_vif [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e')#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.032 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.032 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.033 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:f5:80:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.033 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:7a:14:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.033 186792 INFO nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Using config drive#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.705 186792 INFO nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Creating config drive at /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.config#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.710 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdjaq3szx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.839 186792 DEBUG oslo_concurrency.processutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdjaq3szx" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:31 np0005531888 kernel: tap77356784-a6: entered promiscuous mode
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9270] manager: (tap77356784-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Nov 22 03:12:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:31Z|00516|binding|INFO|Claiming lport 77356784-a6b0-4c1e-9094-0881172d3c27 for this chassis.
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.934 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:31Z|00517|binding|INFO|77356784-a6b0-4c1e-9094-0881172d3c27: Claiming fa:16:3e:f5:80:44 10.100.0.4
Nov 22 03:12:31 np0005531888 kernel: tape94797a8-0e: entered promiscuous mode
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9469] manager: (tape94797a8-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/248)
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.949 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.954 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:31 np0005531888 nova_compute[186788]: 2025-11-22 08:12:31.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:31 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:31Z|00518|if_status|INFO|Not updating pb chassis for e94797a8-0eb2-455e-aebc-d72f3acfb7a6 now as sb is readonly
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9591] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9599] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Nov 22 03:12:31 np0005531888 systemd-udevd[235902]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:12:31 np0005531888 systemd-udevd[235903]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.967 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:80:44 10.100.0.4'], port_security=['fa:16:3e:f5:80:44 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04f3fbae-1178-425a-a955-30dcd392a3d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79e2fe83-1ab0-49c1-acb4-3bc86f0137dc, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=77356784-a6b0-4c1e-9094-0881172d3c27) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.968 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 77356784-a6b0-4c1e-9094-0881172d3c27 in datapath 04f3fbae-1178-425a-a955-30dcd392a3d3 bound to our chassis#033[00m
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.970 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04f3fbae-1178-425a-a955-30dcd392a3d3#033[00m
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9822] device (tape94797a8-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9844] device (tap77356784-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.985 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[04177e5a-e56e-4ea1-aacc-c331e7baf3db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.986 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04f3fbae-11 in ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9869] device (tape94797a8-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:12:31 np0005531888 NetworkManager[55166]: <info>  [1763799151.9880] device (tap77356784-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.988 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04f3fbae-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.989 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1c62db02-628c-436a-bf4b-076acf7fb676]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:31.990 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7514d068-4619-4fe8-a38a-d5cc6caf00fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:31 np0005531888 systemd-machined[153106]: New machine qemu-65-instance-00000085.
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.007 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ede239-f393-4ba7-85cf-b85bff5025be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 systemd[1]: Started Virtual Machine qemu-65-instance-00000085.
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.036 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[316bc1ab-a64d-40aa-a041-9ee125483ecb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.077 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[de8ca303-043f-40eb-ade5-081f3773ea7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 NetworkManager[55166]: <info>  [1763799152.1004] manager: (tap04f3fbae-10): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.099 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dafe91ec-292d-4851-a887-d5e26b36153b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.135 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[95e2518b-3ff8-474e-824c-65414bd546bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.138 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9757b8-d653-43fc-8c14-1e37467c9d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 NetworkManager[55166]: <info>  [1763799152.1591] device (tap04f3fbae-10): carrier: link connected
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.162 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.165 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[274fb10b-b46f-42f7-8660-d8714a9147f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00519|binding|INFO|Claiming lport e94797a8-0eb2-455e-aebc-d72f3acfb7a6 for this chassis.
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00520|binding|INFO|e94797a8-0eb2-455e-aebc-d72f3acfb7a6: Claiming fa:16:3e:7a:14:7e 2001:db8:0:1:f816:3eff:fe7a:147e 2001:db8::f816:3eff:fe7a:147e
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.183 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[557255cb-d127-4fb5-8d2b-57b5588f3ac7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04f3fbae-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:20:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589280, 'reachable_time': 37652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235941, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00521|binding|INFO|Setting lport 77356784-a6b0-4c1e-9094-0881172d3c27 ovn-installed in OVS
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.205 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.208 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:14:7e 2001:db8:0:1:f816:3eff:fe7a:147e 2001:db8::f816:3eff:fe7a:147e'], port_security=['fa:16:3e:7a:14:7e 2001:db8:0:1:f816:3eff:fe7a:147e 2001:db8::f816:3eff:fe7a:147e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7a:147e/64 2001:db8::f816:3eff:fe7a:147e/64', 'neutron:device_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e94797a8-0eb2-455e-aebc-d72f3acfb7a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00522|binding|INFO|Setting lport 77356784-a6b0-4c1e-9094-0881172d3c27 up in Southbound
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.213 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[925e102e-31a6-48f3-a656-afdaf1a620d5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:2089'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589280, 'tstamp': 589280}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235947, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00523|binding|INFO|Setting lport e94797a8-0eb2-455e-aebc-d72f3acfb7a6 ovn-installed in OVS
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00524|binding|INFO|Setting lport e94797a8-0eb2-455e-aebc-d72f3acfb7a6 up in Southbound
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.223 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.235 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[edffdf12-a4f1-4b19-b2be-5297fafec39f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04f3fbae-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:20:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589280, 'reachable_time': 37652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235948, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.265 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[188d262c-64c8-4386-a50f-dc03b8e171b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.269 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799152.2693083, e5342b10-43ea-4199-8bb3-0aa50f8ddd11 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.270 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] VM Started (Lifecycle Event)#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.296 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.301 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799152.2694952, e5342b10-43ea-4199-8bb3-0aa50f8ddd11 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.301 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.317 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.320 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.332 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6afeb82e-f62c-4451-b4a0-d021bcd8a9cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.333 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04f3fbae-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.333 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.334 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04f3fbae-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.334 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 kernel: tap04f3fbae-10: entered promiscuous mode
Nov 22 03:12:32 np0005531888 NetworkManager[55166]: <info>  [1763799152.3363] manager: (tap04f3fbae-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.337 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.339 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04f3fbae-10, col_values=(('external_ids', {'iface-id': '725c746c-ac46-482e-8d13-14e88613ed55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:32Z|00525|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.341 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.342 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04f3fbae-1178-425a-a955-30dcd392a3d3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04f3fbae-1178-425a-a955-30dcd392a3d3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.343 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[20e36cbc-621d-4624-b8c9-7bebeb956f57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.344 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-04f3fbae-1178-425a-a955-30dcd392a3d3
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/04f3fbae-1178-425a-a955-30dcd392a3d3.pid.haproxy
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 04f3fbae-1178-425a-a955-30dcd392a3d3
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:12:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:32.344 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'env', 'PROCESS_TAG=haproxy-04f3fbae-1178-425a-a955-30dcd392a3d3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04f3fbae-1178-425a-a955-30dcd392a3d3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.352 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:32 np0005531888 podman[235981]: 2025-11-22 08:12:32.678202396 +0000 UTC m=+0.020804741 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.783 186792 DEBUG nova.compute.manager [req-cde36dc7-9b92-4f28-beb7-9961a598cec3 req-74e093a2-257f-4255-955a-b97cdc105595 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.784 186792 DEBUG oslo_concurrency.lockutils [req-cde36dc7-9b92-4f28-beb7-9961a598cec3 req-74e093a2-257f-4255-955a-b97cdc105595 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.785 186792 DEBUG oslo_concurrency.lockutils [req-cde36dc7-9b92-4f28-beb7-9961a598cec3 req-74e093a2-257f-4255-955a-b97cdc105595 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.785 186792 DEBUG oslo_concurrency.lockutils [req-cde36dc7-9b92-4f28-beb7-9961a598cec3 req-74e093a2-257f-4255-955a-b97cdc105595 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.786 186792 DEBUG nova.compute.manager [req-cde36dc7-9b92-4f28-beb7-9961a598cec3 req-74e093a2-257f-4255-955a-b97cdc105595 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Processing event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.865 186792 DEBUG nova.compute.manager [req-81a225ba-a200-4e97-93d5-d176f0f4ca77 req-caaa84e6-9dfb-4df4-9f06-6837be36f416 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.866 186792 DEBUG oslo_concurrency.lockutils [req-81a225ba-a200-4e97-93d5-d176f0f4ca77 req-caaa84e6-9dfb-4df4-9f06-6837be36f416 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.866 186792 DEBUG oslo_concurrency.lockutils [req-81a225ba-a200-4e97-93d5-d176f0f4ca77 req-caaa84e6-9dfb-4df4-9f06-6837be36f416 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.866 186792 DEBUG oslo_concurrency.lockutils [req-81a225ba-a200-4e97-93d5-d176f0f4ca77 req-caaa84e6-9dfb-4df4-9f06-6837be36f416 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.866 186792 DEBUG nova.compute.manager [req-81a225ba-a200-4e97-93d5-d176f0f4ca77 req-caaa84e6-9dfb-4df4-9f06-6837be36f416 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Processing event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.867 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.870 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799152.8706841, e5342b10-43ea-4199-8bb3-0aa50f8ddd11 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.871 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.874 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.878 186792 INFO nova.virt.libvirt.driver [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Instance spawned successfully.#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.878 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.905 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.910 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.915 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.915 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.916 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.917 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.917 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.917 186792 DEBUG nova.virt.libvirt.driver [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.947 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.997 186792 INFO nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Took 12.18 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:12:32 np0005531888 nova_compute[186788]: 2025-11-22 08:12:32.998 186792 DEBUG nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:12:32 np0005531888 podman[235981]: 2025-11-22 08:12:32.998632477 +0000 UTC m=+0.341234802 container create cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:12:33 np0005531888 systemd[1]: Started libpod-conmon-cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8.scope.
Nov 22 03:12:33 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:12:33 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e72f92230cd5d8cae98ace53ed166a07e561be82edfd2e38f1a6695483e0d725/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:12:33 np0005531888 podman[235981]: 2025-11-22 08:12:33.082917171 +0000 UTC m=+0.425519526 container init cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.085 186792 INFO nova.compute.manager [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Took 12.73 seconds to build instance.#033[00m
Nov 22 03:12:33 np0005531888 podman[235981]: 2025-11-22 08:12:33.088781795 +0000 UTC m=+0.431384120 container start cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.106 186792 DEBUG oslo_concurrency.lockutils [None req-7e82002b-4465-40e2-9c14-595bd998d6a1 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:33 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [NOTICE]   (236000) : New worker (236002) forked
Nov 22 03:12:33 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [NOTICE]   (236000) : Loading success.
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.154 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e94797a8-0eb2-455e-aebc-d72f3acfb7a6 in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 unbound from our chassis#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.157 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.169 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[53b9d5c8-1235-4d02-8b07-6f66f2f473f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.169 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2b7e9f2d-21 in ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.171 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2b7e9f2d-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.171 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8aba0420-b03b-440f-bd72-85b5fc6aa5bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.172 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c34cc45d-ab6f-464e-8f67-ec1d0007a5a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.184 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f2fe50a2-bc07-4957-9a31-8706415088c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.196 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8af78ef7-5624-4a77-9d26-65df72aff9a4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.224 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[92b8950d-ba04-498e-9e37-df3c4786d853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 NetworkManager[55166]: <info>  [1763799153.2320] manager: (tap2b7e9f2d-20): new Veth device (/org/freedesktop/NetworkManager/Devices/253)
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.232 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba5b5f8-cff0-4096-8fc1-3ac3d6030bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 systemd-udevd[235926]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.262 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bf298fcf-9446-4c4e-8637-d29d0bf45799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.266 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[985748a6-760e-4387-96f9-6dd08016cc6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 podman[236013]: 2025-11-22 08:12:33.285441381 +0000 UTC m=+0.069692785 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Nov 22 03:12:33 np0005531888 NetworkManager[55166]: <info>  [1763799153.2887] device (tap2b7e9f2d-20): carrier: link connected
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.293 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4112357d-c876-4724-84a3-5ebb97f4fa4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.313 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2fcb4e-ac2f-48ba-aef4-aafce31444f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2b7e9f2d-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:2c:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589393, 'reachable_time': 24400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236061, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.330 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[398d69e5-b43a-49e2-a91f-5290eb75fbcd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:2c69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589393, 'tstamp': 589393}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236064, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 podman[236016]: 2025-11-22 08:12:33.345312073 +0000 UTC m=+0.129174017 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.345 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[203df830-a8ef-4738-99f2-7af504249ece]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2b7e9f2d-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:2c:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589393, 'reachable_time': 24400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236065, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.377 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dc91a796-008b-4455-8af9-556cba1eaa19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.414 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4e88a65e-bc9e-4056-9c4a-bd776e097501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.415 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b7e9f2d-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.416 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.416 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b7e9f2d-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:33 np0005531888 kernel: tap2b7e9f2d-20: entered promiscuous mode
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.418 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:33 np0005531888 NetworkManager[55166]: <info>  [1763799153.4214] manager: (tap2b7e9f2d-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.421 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2b7e9f2d-20, col_values=(('external_ids', {'iface-id': 'f86e6fc7-3969-4922-9612-9c86d85f21ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:33 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:33Z|00526|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.434 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.435 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.436 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f66f1edf-d880-45de-9780-ee8a0b07d8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.437 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.pid.haproxy
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:12:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:33.439 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'env', 'PROCESS_TAG=haproxy-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.614 186792 DEBUG nova.network.neutron [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updated VIF entry in instance network info cache for port e94797a8-0eb2-455e-aebc-d72f3acfb7a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.615 186792 DEBUG nova.network.neutron [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.644 186792 DEBUG oslo_concurrency.lockutils [req-f38e13ff-36f2-4d8f-899b-05bef02f56f6 req-3c5d9803-bb5b-4193-8322-6bc0ace9e6b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:12:33 np0005531888 nova_compute[186788]: 2025-11-22 08:12:33.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:33 np0005531888 podman[236096]: 2025-11-22 08:12:33.825641506 +0000 UTC m=+0.056847419 container create cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:12:33 np0005531888 systemd[1]: Started libpod-conmon-cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e.scope.
Nov 22 03:12:33 np0005531888 podman[236096]: 2025-11-22 08:12:33.789899727 +0000 UTC m=+0.021105650 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:12:33 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:12:33 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/726e13722ce5a9aa5f47af6de48093f58eed4d5aff1858d62843ed0018397fb7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:12:33 np0005531888 podman[236096]: 2025-11-22 08:12:33.935350984 +0000 UTC m=+0.166556917 container init cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:12:33 np0005531888 podman[236096]: 2025-11-22 08:12:33.942035929 +0000 UTC m=+0.173241842 container start cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:12:33 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [NOTICE]   (236115) : New worker (236117) forked
Nov 22 03:12:33 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [NOTICE]   (236115) : Loading success.
Nov 22 03:12:34 np0005531888 nova_compute[186788]: 2025-11-22 08:12:34.958 186792 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:34 np0005531888 nova_compute[186788]: 2025-11-22 08:12:34.958 186792 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:34 np0005531888 nova_compute[186788]: 2025-11-22 08:12:34.959 186792 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:34 np0005531888 nova_compute[186788]: 2025-11-22 08:12:34.959 186792 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:34 np0005531888 nova_compute[186788]: 2025-11-22 08:12:34.959 186792 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] No waiting events found dispatching network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:12:34 np0005531888 nova_compute[186788]: 2025-11-22 08:12:34.959 186792 WARNING nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received unexpected event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.105 186792 DEBUG nova.compute.manager [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.105 186792 DEBUG oslo_concurrency.lockutils [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.105 186792 DEBUG oslo_concurrency.lockutils [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.105 186792 DEBUG oslo_concurrency.lockutils [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.106 186792 DEBUG nova.compute.manager [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] No waiting events found dispatching network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.106 186792 WARNING nova.compute.manager [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received unexpected event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:12:35 np0005531888 nova_compute[186788]: 2025-11-22 08:12:35.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:36.827 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:36.828 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:12:36.829 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.847 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'name': 'tempest-TestGettingAddress-server-1459174835', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000085', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.876 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.877 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3760a62-b7bf-4141-bb14-7d7f02bf16de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.847673', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01833e6c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': '29470fcd90ec3a904ebc2a8889976d026bd677738ce04115a10534b1963831d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.847673', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01834c40-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'b5b83f5fba9b7c842eb5e6beb404737f2ec39fc56c43f3049f2b70ffd87d33aa'}]}, 'timestamp': '2025-11-22 08:12:36.877367', '_unique_id': '47c3f68ccc424c6ab953fab80320d9bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.880 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.880 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efd2bfa1-e6ce-4dc4-a0be-3eaa9aa2677e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.879979', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0183bffe-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': '36289ec6bbff5fd0aa66963903e102bbe0b03d6acf7355754c76b140fa827648'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.879979', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0183d21e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'fe9dc3bb2abd617c6fac625f3ba6f15adeeded8ef230de44c8ea0773581b34e2'}]}, 'timestamp': '2025-11-22 08:12:36.880799', '_unique_id': 'd81783cd94044a058f6e00a06a77efa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.882 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.882 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbe0d0ae-9c73-4f51-b7d3-acedfd736409', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.882506', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0184214c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'dc851483a3b22c84d103346e06a82cd80390024e2fb9e02961b1b8489af9fca4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.882506', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '018429e4-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'f6ab8ddad0b7ae6e2a6584a27ba1d799b44136a7ecfdff68edd609bc155ab8e1'}]}, 'timestamp': '2025-11-22 08:12:36.883027', '_unique_id': '48deb349ce4b4b39a8850dc7fd7624d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.883 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.888 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e5342b10-43ea-4199-8bb3-0aa50f8ddd11 / tap77356784-a6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.889 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e5342b10-43ea-4199-8bb3-0aa50f8ddd11 / tape94797a8-0e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.889 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.890 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e4ad9e5-0643-48c7-8cbf-3870a9361dbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.884203', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '01853d20-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '106c883b0120113bc775b9d3e9c3c969513568882ce75c550a9a094b786f7037'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.884203', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '01854ce8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '64e72064646400f330552d1b8e8a703df4ada2c231696b03c6ec58e6815b8ef6'}]}, 'timestamp': '2025-11-22 08:12:36.890498', '_unique_id': '28695d98a9394c268c2f2627a088d657'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.892 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.893 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89e360bf-0a8c-4fc9-8bc5-7d4411783c4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.892705', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '0185b1ce-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '32a547734f92dbdad86d28223a5aecdd9aa7d5f94a450f1c14ea11e376ac37e7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.892705', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '0185bf0c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '344dc0bf4d5eb43e1d81443de400f3e97ca2398ee821d24d05d0c03240afb5fa'}]}, 'timestamp': '2025-11-22 08:12:36.893412', '_unique_id': 'bf9a9ae97b3e4426b4190308188d2143'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.895 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.895 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ebd5e2b-2c7f-40e1-9f8f-623064bae7e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.895408', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '01861c0e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': 'f3c2c86e8c099517146a83bffeeef83593345ef1acaed409c9cfb55db23ff7ca'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.895408', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '01862960-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '670e06e73a37ecd2cf0f19ee72a666cd632fbcba8664e2553ef654d32cb41cd1'}]}, 'timestamp': '2025-11-22 08:12:36.896134', '_unique_id': '3be1c4f52bdb41eaa62abf44a123580b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.896 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.898 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.898 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>]
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.898 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.898 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.898 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>]
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.899 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.918 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/cpu volume: 3850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52bfcc17-f7ca-4417-84c3-322e3fdc4d0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3850000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'timestamp': '2025-11-22T08:12:36.899294', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0189a3a6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.61741972, 'message_signature': '1e100296ab2afb7344b81e1ba25224d0bfeb782987b1f09c5fcd69430cc1ba2d'}]}, 'timestamp': '2025-11-22 08:12:36.919083', '_unique_id': 'a7017655a0c14fb982f4e2035985c99a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.922 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.922 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b15e375a-78cc-4bc3-a122-c890ca2c87d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.922489', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '018a4040-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': 'a8e9914c1027fa66b1dabec6ec44ea863557e91f1a39a5bd8ce248eee1644520'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.922489', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '018a4eb4-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '46639994c7b7824b56f97187c2b13e730f68234c27c81f0247a6bad9c3b0f2f8'}]}, 'timestamp': '2025-11-22 08:12:36.923271', '_unique_id': 'dae9654cdeb04880a5e9f40d364a1211'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.925 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.925 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa5e5cbc-4457-4128-baf4-a343e64d0e90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.925176', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '018aa594-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'bcbe8fb8fb6d6b25b056fb587c1c65fba095e99f10a92aff2831b8491b03fddf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.925176', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '018ab106-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': '63357cfce7ef328e3f6f55bf15841f8d4b62335a3c346651153cd00d1159fd1a'}]}, 'timestamp': '2025-11-22 08:12:36.925774', '_unique_id': '0f06bd8b79434f45ba24ca8cbc943a90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.928 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.928 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c153080c-ad9b-4639-b5fb-6806877ef732', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.928145', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '018b1baa-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '624a59591c0a4ecbf5fd9791f05303d205a69017e0a2574d75852dbdab4507c9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.928145', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '018b2bfe-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '9d6d4dd5b0daf83cefd81547bafd585355e46a46bbfd23275d0497fb79041475'}]}, 'timestamp': '2025-11-22 08:12:36.928979', '_unique_id': '6df0547346744f4092dba098a9396390'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.931 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.931 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66bf8f5c-c853-48da-a01d-e600b488df79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.931462', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '018b9d82-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '070f3c90eef6ef5b2317b8b92d56c1bc37daed8c2d40324009f36f8ca7b8b67f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.931462', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '018ba9c6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': 'a84ec3473a6a5e9a7db4891b7ae2aa06026667ad8e655fcf225fb9757fc5ff18'}]}, 'timestamp': '2025-11-22 08:12:36.932151', '_unique_id': '72ee23fdb74b4dcdbc2caac65349f4c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.934 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.934 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>]
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.934 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.934 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b03c8477-a0e0-433f-a5bd-525d78fce56a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.934618', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '018c1762-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'de45619cb3b40e213280ca9652edd6d3d3bc8b7a168d4035aa0a6f6a620a80a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.934618', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '018c2446-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'b4d7ea91a7d352cbf6965aa7b8d94451c5770ea0f8b470bb09bced6dbc08b5ca'}]}, 'timestamp': '2025-11-22 08:12:36.935311', '_unique_id': '47c933bccf104fcbbb93321a20cd1e3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.951 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.952 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d9690d8-f0a9-4939-9753-e049ea5a9f9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.937430', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '018ec9f8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.636997711, 'message_signature': '18323709a5046c9ccde5ddbb8cf81ce1cfcf9e623f8ceb0c6077f3f2535f83a6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.937430', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '018ed9ac-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.636997711, 'message_signature': '24da9895a6365f4949f9a04a75bd81cfbbfacfc59e15cc0d13a71b7f455f1825'}]}, 'timestamp': '2025-11-22 08:12:36.953051', '_unique_id': '1d3c4bb92d534d6483db84912f7526c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.956 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.956 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '793c195d-6765-4ee0-b180-a5608a88a4ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.956063', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '018f5bb6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.636997711, 'message_signature': 'cd3528d97069e6ac080e40b880442505e3d430afdf2b2b21f38cd5b62df2fb54'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.956063', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '018f6e8a-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.636997711, 'message_signature': '7191eddd6fbb62bd7abd2b74305296b89a67af8f5cca1cd9a2a5e04b7033048e'}]}, 'timestamp': '2025-11-22 08:12:36.956839', '_unique_id': '46aeae24abc24366891ce2eea1daafa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.959 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.959 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1459174835>]
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.959 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.960 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3127c4b-2d06-497b-8fc6-30a9a8a54e5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.959773', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '018fed4c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '5fe8fe9196d8a3ea2961f6d544dec9b1ccfe9fa2a3a95c1c1d15e7e4af74815c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.959773', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '018ff9ea-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': 'e4abfbd0a578cdedf5623d4c97bdcf166ddec7b992c0d4b0edcf4dbd0860e367'}]}, 'timestamp': '2025-11-22 08:12:36.960443', '_unique_id': '3acfe74bf05341dc9955619fade40fb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.962 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.962 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44e3932d-b052-4a55-81c4-9d64aeb31afe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.962603', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '01905aac-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': 'd0d68bb981ade707e1bb97fe5969bc9eed743152e4a415ea79769a85a9c62ae8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.962603', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '01906cd6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': 'd0d2e25612dc97bebae9bdfd9967dfe984b23eb2570c826ed192a58dc1852e86'}]}, 'timestamp': '2025-11-22 08:12:36.963401', '_unique_id': 'bbbc7c21581240aa9bd0b4c4313cd176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.965 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.966 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2398395-380f-4823-95bf-aa8418f99d2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.965856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0190db80-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.636997711, 'message_signature': '561000f9cb644d9b29e82cf0f25222e98110ca95e5367388c4e0f71e0f4e0d11'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.965856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0190e800-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.636997711, 'message_signature': '2356bbe266702eecf663edbc9a60256fe617d5e0ed6f0249e85421833fe274d2'}]}, 'timestamp': '2025-11-22 08:12:36.966505', '_unique_id': '5c30d0eb15fd47bdba1dc6c925b5289d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.968 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.968 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.969 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e5342b10-43ea-4199-8bb3-0aa50f8ddd11: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.969 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.969 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.970 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1250a207-b877-443e-bd27-b1006a78b709', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.969512', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '01916c44-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '933c6191ee0a639f8b02d494669e49812e213b47a539c75ca4062f50272b52db'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.969512', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '019186b6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '8c10137d959b823e4bd2947ed70fc0a2ebc1f055703065b9df899473c28aaee2'}]}, 'timestamp': '2025-11-22 08:12:36.970658', '_unique_id': '41c5274d79684d00b306d23a92d630b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.973 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.read.latency volume: 860388041 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.973 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk.device.read.latency volume: 3518657 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4d29f75-971a-4286-b864-5c70a43c94cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 860388041, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-vda', 'timestamp': '2025-11-22T08:12:36.973418', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01920668-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': '0fc5021f2a7ac56ae04fdd2927b349f6cc9f95e35d4bc4f74402ef61cd880531'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3518657, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11-sda', 'timestamp': '2025-11-22T08:12:36.973418', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'instance-00000085', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '019218b0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.547162501, 'message_signature': 'e61d4e40ebcca2f92c1c2b776b1c793d12e4c94cb4eec55519d5476bdfd12ba1'}]}, 'timestamp': '2025-11-22 08:12:36.974391', '_unique_id': '2b528c05de20450b986b09bcd0b43dc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.977 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.977 12 DEBUG ceilometer.compute.pollsters [-] e5342b10-43ea-4199-8bb3-0aa50f8ddd11/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11963de6-fd38-4dec-8117-d07b106926f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tap77356784-a6', 'timestamp': '2025-11-22T08:12:36.976996', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tap77356784-a6', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:80:44', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap77356784-a6'}, 'message_id': '01928e12-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '5c93f95fb965f30f343efb9468b099325897fddf00f082f82edd61a65e71881a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000085-e5342b10-43ea-4199-8bb3-0aa50f8ddd11-tape94797a8-0e', 'timestamp': '2025-11-22T08:12:36.976996', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1459174835', 'name': 'tape94797a8-0e', 'instance_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7a:14:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape94797a8-0e'}, 'message_id': '01929916-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 5897.58371533, 'message_signature': '63879e0b11dd67cb63394e4b726a175420499e2a0ceeb8ca0c37936c6d4770d3'}]}, 'timestamp': '2025-11-22 08:12:36.977620', '_unique_id': 'f808ebe5f2bb4fb78fd228a33b9d359b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:12:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:37 np0005531888 nova_compute[186788]: 2025-11-22 08:12:37.109 186792 DEBUG nova.compute.manager [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-changed-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:12:37 np0005531888 nova_compute[186788]: 2025-11-22 08:12:37.110 186792 DEBUG nova.compute.manager [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing instance network info cache due to event network-changed-77356784-a6b0-4c1e-9094-0881172d3c27. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:12:37 np0005531888 nova_compute[186788]: 2025-11-22 08:12:37.110 186792 DEBUG oslo_concurrency.lockutils [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:12:37 np0005531888 nova_compute[186788]: 2025-11-22 08:12:37.112 186792 DEBUG oslo_concurrency.lockutils [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:12:37 np0005531888 nova_compute[186788]: 2025-11-22 08:12:37.113 186792 DEBUG nova.network.neutron [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing network info cache for port 77356784-a6b0-4c1e-9094-0881172d3c27 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:12:38 np0005531888 nova_compute[186788]: 2025-11-22 08:12:38.791 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:39 np0005531888 nova_compute[186788]: 2025-11-22 08:12:39.304 186792 DEBUG nova.network.neutron [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updated VIF entry in instance network info cache for port 77356784-a6b0-4c1e-9094-0881172d3c27. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:12:39 np0005531888 nova_compute[186788]: 2025-11-22 08:12:39.305 186792 DEBUG nova.network.neutron [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:12:39 np0005531888 nova_compute[186788]: 2025-11-22 08:12:39.365 186792 DEBUG oslo_concurrency.lockutils [req-7bc42893-31b4-4acd-8709-3654224e2b79 req-5a6b8dba-a1ab-4100-8289-6488a0ece778 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:12:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:40Z|00527|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:40Z|00528|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:40 np0005531888 nova_compute[186788]: 2025-11-22 08:12:40.088 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:40 np0005531888 nova_compute[186788]: 2025-11-22 08:12:40.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:42Z|00529|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:42Z|00530|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:42 np0005531888 nova_compute[186788]: 2025-11-22 08:12:42.447 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:43Z|00531|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:43Z|00532|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:43 np0005531888 nova_compute[186788]: 2025-11-22 08:12:43.468 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:43 np0005531888 nova_compute[186788]: 2025-11-22 08:12:43.793 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:45 np0005531888 podman[236129]: 2025-11-22 08:12:45.687861857 +0000 UTC m=+0.053905786 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:12:45 np0005531888 podman[236128]: 2025-11-22 08:12:45.708015623 +0000 UTC m=+0.077260862 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:12:45 np0005531888 nova_compute[186788]: 2025-11-22 08:12:45.961 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:48Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:80:44 10.100.0.4
Nov 22 03:12:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:12:48Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:80:44 10.100.0.4
Nov 22 03:12:48 np0005531888 nova_compute[186788]: 2025-11-22 08:12:48.796 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:50 np0005531888 nova_compute[186788]: 2025-11-22 08:12:50.473 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:50 np0005531888 nova_compute[186788]: 2025-11-22 08:12:50.963 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:52 np0005531888 nova_compute[186788]: 2025-11-22 08:12:52.268 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:52 np0005531888 nova_compute[186788]: 2025-11-22 08:12:52.859 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:53 np0005531888 nova_compute[186788]: 2025-11-22 08:12:53.799 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:55 np0005531888 nova_compute[186788]: 2025-11-22 08:12:55.656 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:55 np0005531888 podman[236191]: 2025-11-22 08:12:55.677425961 +0000 UTC m=+0.056097121 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:12:55 np0005531888 nova_compute[186788]: 2025-11-22 08:12:55.966 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:57 np0005531888 podman[236210]: 2025-11-22 08:12:57.675281655 +0000 UTC m=+0.048878173 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:12:58 np0005531888 nova_compute[186788]: 2025-11-22 08:12:58.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:58 np0005531888 nova_compute[186788]: 2025-11-22 08:12:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:59 np0005531888 nova_compute[186788]: 2025-11-22 08:12:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:59 np0005531888 nova_compute[186788]: 2025-11-22 08:12:59.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:12:59 np0005531888 nova_compute[186788]: 2025-11-22 08:12:59.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:13:00 np0005531888 nova_compute[186788]: 2025-11-22 08:13:00.145 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:00 np0005531888 nova_compute[186788]: 2025-11-22 08:13:00.145 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:00 np0005531888 nova_compute[186788]: 2025-11-22 08:13:00.145 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:13:00 np0005531888 nova_compute[186788]: 2025-11-22 08:13:00.145 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e5342b10-43ea-4199-8bb3-0aa50f8ddd11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:00 np0005531888 podman[236235]: 2025-11-22 08:13:00.685750602 +0000 UTC m=+0.056504931 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350)
Nov 22 03:13:00 np0005531888 nova_compute[186788]: 2025-11-22 08:13:00.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:01 np0005531888 nova_compute[186788]: 2025-11-22 08:13:01.267 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:03 np0005531888 podman[236256]: 2025-11-22 08:13:03.709523915 +0000 UTC m=+0.074933653 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:13:03 np0005531888 podman[236257]: 2025-11-22 08:13:03.711021643 +0000 UTC m=+0.072689148 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:13:03 np0005531888 nova_compute[186788]: 2025-11-22 08:13:03.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:04 np0005531888 nova_compute[186788]: 2025-11-22 08:13:04.247 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": 
"2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:04 np0005531888 nova_compute[186788]: 2025-11-22 08:13:04.264 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:04 np0005531888 nova_compute[186788]: 2025-11-22 08:13:04.264 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:13:04 np0005531888 nova_compute[186788]: 2025-11-22 08:13:04.265 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:04 np0005531888 nova_compute[186788]: 2025-11-22 08:13:04.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:04 np0005531888 nova_compute[186788]: 2025-11-22 08:13:04.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:05 np0005531888 nova_compute[186788]: 2025-11-22 08:13:05.732 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:05 np0005531888 nova_compute[186788]: 2025-11-22 08:13:05.971 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:07 np0005531888 nova_compute[186788]: 2025-11-22 08:13:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:08 np0005531888 nova_compute[186788]: 2025-11-22 08:13:08.806 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:09 np0005531888 nova_compute[186788]: 2025-11-22 08:13:09.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:09 np0005531888 nova_compute[186788]: 2025-11-22 08:13:09.972 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:09 np0005531888 nova_compute[186788]: 2025-11-22 08:13:09.972 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:09 np0005531888 nova_compute[186788]: 2025-11-22 08:13:09.972 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:09 np0005531888 nova_compute[186788]: 2025-11-22 08:13:09.973 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.024 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.088 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.090 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.148 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.314 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.315 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5533MB free_disk=73.24533081054688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.316 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.316 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.420 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance e5342b10-43ea-4199-8bb3-0aa50f8ddd11 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.421 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.421 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.462 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.477 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.508 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.508 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:10 np0005531888 nova_compute[186788]: 2025-11-22 08:13:10.974 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.510 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.928 186792 DEBUG nova.compute.manager [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-changed-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.928 186792 DEBUG nova.compute.manager [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing instance network info cache due to event network-changed-77356784-a6b0-4c1e-9094-0881172d3c27. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.928 186792 DEBUG oslo_concurrency.lockutils [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.929 186792 DEBUG oslo_concurrency.lockutils [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.929 186792 DEBUG nova.network.neutron [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Refreshing network info cache for port 77356784-a6b0-4c1e-9094-0881172d3c27 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:11 np0005531888 nova_compute[186788]: 2025-11-22 08:13:11.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.056 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.057 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.057 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.057 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.058 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.067 186792 INFO nova.compute.manager [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Terminating instance#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.075 186792 DEBUG nova.compute.manager [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:13:12 np0005531888 kernel: tap77356784-a6 (unregistering): left promiscuous mode
Nov 22 03:13:12 np0005531888 NetworkManager[55166]: <info>  [1763799192.0968] device (tap77356784-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.108 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:12Z|00533|binding|INFO|Releasing lport 77356784-a6b0-4c1e-9094-0881172d3c27 from this chassis (sb_readonly=0)
Nov 22 03:13:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:12Z|00534|binding|INFO|Setting lport 77356784-a6b0-4c1e-9094-0881172d3c27 down in Southbound
Nov 22 03:13:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:12Z|00535|binding|INFO|Removing iface tap77356784-a6 ovn-installed in OVS
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.112 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:12.117 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:80:44 10.100.0.4'], port_security=['fa:16:3e:f5:80:44 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04f3fbae-1178-425a-a955-30dcd392a3d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79e2fe83-1ab0-49c1-acb4-3bc86f0137dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=77356784-a6b0-4c1e-9094-0881172d3c27) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:12.119 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 77356784-a6b0-4c1e-9094-0881172d3c27 in datapath 04f3fbae-1178-425a-a955-30dcd392a3d3 unbound from our chassis#033[00m
Nov 22 03:13:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:12.121 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04f3fbae-1178-425a-a955-30dcd392a3d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:13:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:12.122 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5731bc-fd2b-4835-a486-68527b0836c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:12.122 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 namespace which is not needed anymore#033[00m
Nov 22 03:13:12 np0005531888 kernel: tape94797a8-0e (unregistering): left promiscuous mode
Nov 22 03:13:12 np0005531888 NetworkManager[55166]: <info>  [1763799192.1286] device (tape94797a8-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.129 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.138 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:12Z|00536|binding|INFO|Releasing lport e94797a8-0eb2-455e-aebc-d72f3acfb7a6 from this chassis (sb_readonly=0)
Nov 22 03:13:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:12Z|00537|binding|INFO|Setting lport e94797a8-0eb2-455e-aebc-d72f3acfb7a6 down in Southbound
Nov 22 03:13:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:12Z|00538|binding|INFO|Removing iface tape94797a8-0e ovn-installed in OVS
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.143 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:12.146 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:14:7e 2001:db8:0:1:f816:3eff:fe7a:147e 2001:db8::f816:3eff:fe7a:147e'], port_security=['fa:16:3e:7a:14:7e 2001:db8:0:1:f816:3eff:fe7a:147e 2001:db8::f816:3eff:fe7a:147e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7a:147e/64 2001:db8::f816:3eff:fe7a:147e/64', 'neutron:device_id': 'e5342b10-43ea-4199-8bb3-0aa50f8ddd11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e94797a8-0eb2-455e-aebc-d72f3acfb7a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.162 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000085.scope: Deactivated successfully.
Nov 22 03:13:12 np0005531888 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000085.scope: Consumed 15.498s CPU time.
Nov 22 03:13:12 np0005531888 systemd-machined[153106]: Machine qemu-65-instance-00000085 terminated.
Nov 22 03:13:12 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [NOTICE]   (236000) : haproxy version is 2.8.14-c23fe91
Nov 22 03:13:12 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [NOTICE]   (236000) : path to executable is /usr/sbin/haproxy
Nov 22 03:13:12 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [WARNING]  (236000) : Exiting Master process...
Nov 22 03:13:12 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [ALERT]    (236000) : Current worker (236002) exited with code 143 (Terminated)
Nov 22 03:13:12 np0005531888 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[235996]: [WARNING]  (236000) : All workers exited. Exiting... (0)
Nov 22 03:13:12 np0005531888 systemd[1]: libpod-cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8.scope: Deactivated successfully.
Nov 22 03:13:12 np0005531888 podman[236336]: 2025-11-22 08:13:12.296430435 +0000 UTC m=+0.078606315 container died cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.310 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 NetworkManager[55166]: <info>  [1763799192.3114] manager: (tape94797a8-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.352 186792 INFO nova.virt.libvirt.driver [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Instance destroyed successfully.#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.353 186792 DEBUG nova.objects.instance [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid e5342b10-43ea-4199-8bb3-0aa50f8ddd11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.366 186792 DEBUG nova.virt.libvirt.vif [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:12:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1459174835',display_name='tempest-TestGettingAddress-server-1459174835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1459174835',id=133,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:12:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-mrw6wiad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:12:33Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=e5342b10-43ea-4199-8bb3-0aa50f8ddd11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.367 186792 DEBUG nova.network.os_vif_util [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.367 186792 DEBUG nova.network.os_vif_util [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.368 186792 DEBUG os_vif [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.370 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.370 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77356784-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.373 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.376 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.378 186792 INFO os_vif [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:80:44,bridge_name='br-int',has_traffic_filtering=True,id=77356784-a6b0-4c1e-9094-0881172d3c27,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77356784-a6')#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.378 186792 DEBUG nova.virt.libvirt.vif [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:12:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1459174835',display_name='tempest-TestGettingAddress-server-1459174835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1459174835',id=133,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:12:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-mrw6wiad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:12:33Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=e5342b10-43ea-4199-8bb3-0aa50f8ddd11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.379 186792 DEBUG nova.network.os_vif_util [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.379 186792 DEBUG nova.network.os_vif_util [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.380 186792 DEBUG os_vif [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.381 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.381 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape94797a8-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.382 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.385 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.387 186792 INFO os_vif [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:14:7e,bridge_name='br-int',has_traffic_filtering=True,id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94797a8-0e')#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.388 186792 INFO nova.virt.libvirt.driver [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Deleting instance files /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11_del#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.389 186792 INFO nova.virt.libvirt.driver [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Deletion of /var/lib/nova/instances/e5342b10-43ea-4199-8bb3-0aa50f8ddd11_del complete#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.456 186792 INFO nova.compute.manager [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.456 186792 DEBUG oslo.service.loopingcall [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.456 186792 DEBUG nova.compute.manager [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.457 186792 DEBUG nova.network.neutron [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:13:12 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8-userdata-shm.mount: Deactivated successfully.
Nov 22 03:13:12 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e72f92230cd5d8cae98ace53ed166a07e561be82edfd2e38f1a6695483e0d725-merged.mount: Deactivated successfully.
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.621 186792 DEBUG nova.compute.manager [req-b1eb7930-f083-4744-9a8a-ff4eb13c1e59 req-d28f02c3-fb0f-4e92-8844-a6170ebd8db4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-unplugged-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.622 186792 DEBUG oslo_concurrency.lockutils [req-b1eb7930-f083-4744-9a8a-ff4eb13c1e59 req-d28f02c3-fb0f-4e92-8844-a6170ebd8db4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.622 186792 DEBUG oslo_concurrency.lockutils [req-b1eb7930-f083-4744-9a8a-ff4eb13c1e59 req-d28f02c3-fb0f-4e92-8844-a6170ebd8db4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.622 186792 DEBUG oslo_concurrency.lockutils [req-b1eb7930-f083-4744-9a8a-ff4eb13c1e59 req-d28f02c3-fb0f-4e92-8844-a6170ebd8db4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.622 186792 DEBUG nova.compute.manager [req-b1eb7930-f083-4744-9a8a-ff4eb13c1e59 req-d28f02c3-fb0f-4e92-8844-a6170ebd8db4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] No waiting events found dispatching network-vif-unplugged-77356784-a6b0-4c1e-9094-0881172d3c27 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:12 np0005531888 nova_compute[186788]: 2025-11-22 08:13:12.623 186792 DEBUG nova.compute.manager [req-b1eb7930-f083-4744-9a8a-ff4eb13c1e59 req-d28f02c3-fb0f-4e92-8844-a6170ebd8db4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-unplugged-77356784-a6b0-4c1e-9094-0881172d3c27 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:13:12 np0005531888 podman[236336]: 2025-11-22 08:13:12.857486353 +0000 UTC m=+0.639662233 container cleanup cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 03:13:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:13Z|00539|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:13:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:13Z|00540|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.040 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:13Z|00541|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:13:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:13Z|00542|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.246 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.529 186792 DEBUG nova.compute.manager [req-853a45c8-6c6e-4e62-8400-e17cd53aea95 req-4327403c-eaa5-47dc-b7e9-fffac1552181 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-deleted-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.529 186792 INFO nova.compute.manager [req-853a45c8-6c6e-4e62-8400-e17cd53aea95 req-4327403c-eaa5-47dc-b7e9-fffac1552181 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Neutron deleted interface e94797a8-0eb2-455e-aebc-d72f3acfb7a6; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.530 186792 DEBUG nova.network.neutron [req-853a45c8-6c6e-4e62-8400-e17cd53aea95 req-4327403c-eaa5-47dc-b7e9-fffac1552181 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.548 186792 DEBUG nova.compute.manager [req-853a45c8-6c6e-4e62-8400-e17cd53aea95 req-4327403c-eaa5-47dc-b7e9-fffac1552181 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Detach interface failed, port_id=e94797a8-0eb2-455e-aebc-d72f3acfb7a6, reason: Instance e5342b10-43ea-4199-8bb3-0aa50f8ddd11 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.651 186792 DEBUG nova.network.neutron [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updated VIF entry in instance network info cache for port 77356784-a6b0-4c1e-9094-0881172d3c27. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.651 186792 DEBUG nova.network.neutron [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [{"id": "77356784-a6b0-4c1e-9094-0881172d3c27", "address": "fa:16:3e:f5:80:44", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77356784-a6", "ovs_interfaceid": "77356784-a6b0-4c1e-9094-0881172d3c27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "address": "fa:16:3e:7a:14:7e", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7a:147e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94797a8-0e", "ovs_interfaceid": "e94797a8-0eb2-455e-aebc-d72f3acfb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.678 186792 DEBUG oslo_concurrency.lockutils [req-e08fe44f-06fa-4514-875a-2fbbf1a4ad91 req-f480dabb-05eb-49ec-92c3-c83cb3d4e253 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e5342b10-43ea-4199-8bb3-0aa50f8ddd11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.808 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.864 186792 DEBUG nova.network.neutron [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.883 186792 INFO nova.compute.manager [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Took 1.43 seconds to deallocate network for instance.#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.953 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.953 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:13 np0005531888 nova_compute[186788]: 2025-11-22 08:13:13.995 186792 DEBUG nova.compute.provider_tree [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.000 186792 DEBUG nova.compute.manager [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-unplugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.001 186792 DEBUG oslo_concurrency.lockutils [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.001 186792 DEBUG oslo_concurrency.lockutils [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.001 186792 DEBUG oslo_concurrency.lockutils [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.001 186792 DEBUG nova.compute.manager [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] No waiting events found dispatching network-vif-unplugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.002 186792 WARNING nova.compute.manager [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received unexpected event network-vif-unplugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.002 186792 DEBUG nova.compute.manager [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.002 186792 DEBUG oslo_concurrency.lockutils [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.002 186792 DEBUG oslo_concurrency.lockutils [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.002 186792 DEBUG oslo_concurrency.lockutils [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.003 186792 DEBUG nova.compute.manager [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] No waiting events found dispatching network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.003 186792 WARNING nova.compute.manager [req-429053b3-6738-4053-99b7-700b74e48800 req-37186b9b-d95d-4b75-892d-7e568d64c540 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received unexpected event network-vif-plugged-e94797a8-0eb2-455e-aebc-d72f3acfb7a6 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.007 186792 DEBUG nova.scheduler.client.report [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.033 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.072 186792 INFO nova.scheduler.client.report [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance e5342b10-43ea-4199-8bb3-0aa50f8ddd11#033[00m
Nov 22 03:13:14 np0005531888 podman[236390]: 2025-11-22 08:13:14.134538079 +0000 UTC m=+1.256273396 container remove cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.140 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0197ec8f-5edb-4f17-b83f-29a78dc68c0c]: (4, ('Sat Nov 22 08:13:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 (cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8)\ncc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8\nSat Nov 22 08:13:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 (cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8)\ncc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.141 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c8872e15-e16e-4c25-aa66-535dd9d33b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.142 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04f3fbae-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.144 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:14 np0005531888 kernel: tap04f3fbae-10: left promiscuous mode
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.160 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[67755486-28e2-447f-8d85-be84681e373f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.170 186792 DEBUG oslo_concurrency.lockutils [None req-b5fabf8f-0202-4093-bbca-bdca66d867a6 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.183 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd530bf-e942-4c80-9940-a161d3c7050b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.184 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9282957f-7aa2-426f-a459-4a8513111192]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.201 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[35e55eb5-6388-43b0-8c20-e304d143ff4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589271, 'reachable_time': 43930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236405, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.205 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:13:14 np0005531888 systemd[1]: run-netns-ovnmeta\x2d04f3fbae\x2d1178\x2d425a\x2da955\x2d30dcd392a3d3.mount: Deactivated successfully.
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.206 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[495e8b77-d83f-48b9-82dc-6c33c4cef5dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.207 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e94797a8-0eb2-455e-aebc-d72f3acfb7a6 in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 unbound from our chassis#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.208 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.210 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9697da-02f5-4bdb-9229-da25e4f78348]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:14.211 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 namespace which is not needed anymore#033[00m
Nov 22 03:13:14 np0005531888 systemd[1]: libpod-conmon-cc5259e2ba15fb3ec7b9fe8148bf08202a1d69a76a8d190fec0b72179634abe8.scope: Deactivated successfully.
Nov 22 03:13:14 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [NOTICE]   (236115) : haproxy version is 2.8.14-c23fe91
Nov 22 03:13:14 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [NOTICE]   (236115) : path to executable is /usr/sbin/haproxy
Nov 22 03:13:14 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [WARNING]  (236115) : Exiting Master process...
Nov 22 03:13:14 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [ALERT]    (236115) : Current worker (236117) exited with code 143 (Terminated)
Nov 22 03:13:14 np0005531888 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[236111]: [WARNING]  (236115) : All workers exited. Exiting... (0)
Nov 22 03:13:14 np0005531888 systemd[1]: libpod-cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e.scope: Deactivated successfully.
Nov 22 03:13:14 np0005531888 podman[236423]: 2025-11-22 08:13:14.435903071 +0000 UTC m=+0.135563914 container died cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:13:14 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e-userdata-shm.mount: Deactivated successfully.
Nov 22 03:13:14 np0005531888 systemd[1]: var-lib-containers-storage-overlay-726e13722ce5a9aa5f47af6de48093f58eed4d5aff1858d62843ed0018397fb7-merged.mount: Deactivated successfully.
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.727 186792 DEBUG nova.compute.manager [req-83267794-4fb5-4d75-85a9-59946b930ad4 req-de822b42-5653-4740-883f-534eb470273f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.728 186792 DEBUG oslo_concurrency.lockutils [req-83267794-4fb5-4d75-85a9-59946b930ad4 req-de822b42-5653-4740-883f-534eb470273f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.728 186792 DEBUG oslo_concurrency.lockutils [req-83267794-4fb5-4d75-85a9-59946b930ad4 req-de822b42-5653-4740-883f-534eb470273f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.728 186792 DEBUG oslo_concurrency.lockutils [req-83267794-4fb5-4d75-85a9-59946b930ad4 req-de822b42-5653-4740-883f-534eb470273f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e5342b10-43ea-4199-8bb3-0aa50f8ddd11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.729 186792 DEBUG nova.compute.manager [req-83267794-4fb5-4d75-85a9-59946b930ad4 req-de822b42-5653-4740-883f-534eb470273f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] No waiting events found dispatching network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:14 np0005531888 nova_compute[186788]: 2025-11-22 08:13:14.729 186792 WARNING nova.compute.manager [req-83267794-4fb5-4d75-85a9-59946b930ad4 req-de822b42-5653-4740-883f-534eb470273f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received unexpected event network-vif-plugged-77356784-a6b0-4c1e-9094-0881172d3c27 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:14 np0005531888 podman[236423]: 2025-11-22 08:13:14.863305072 +0000 UTC m=+0.562965895 container cleanup cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:13:14 np0005531888 systemd[1]: libpod-conmon-cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e.scope: Deactivated successfully.
Nov 22 03:13:15 np0005531888 podman[236455]: 2025-11-22 08:13:15.733180505 +0000 UTC m=+0.850898358 container remove cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.739 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f41b75c5-b259-4ffb-aec8-669c6f1abcf8]: (4, ('Sat Nov 22 08:13:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 (cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e)\ncfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e\nSat Nov 22 08:13:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 (cfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e)\ncfe42d17749a69a0ed92b3a080e2ac5686717df631a121650adb51681aae5b4e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.741 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d096334d-af81-45ca-be1c-28fec24b911a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.742 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b7e9f2d-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:15 np0005531888 nova_compute[186788]: 2025-11-22 08:13:15.744 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:15 np0005531888 kernel: tap2b7e9f2d-20: left promiscuous mode
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.751 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[771b6108-5fff-43c8-a1af-72193359fae1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 nova_compute[186788]: 2025-11-22 08:13:15.754 186792 DEBUG nova.compute.manager [req-603a8e9d-cf76-46a6-82d4-9dbcf39b9aaf req-4fa5bd6d-2ce6-473d-a3c1-ad8fe56dd9a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Received event network-vif-deleted-77356784-a6b0-4c1e-9094-0881172d3c27 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:15 np0005531888 nova_compute[186788]: 2025-11-22 08:13:15.758 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.767 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8681948e-3d21-4dba-9d72-7415015047fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.768 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cb701e2a-7557-4a12-8747-3332f17c5d72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.785 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[06af0014-2c7d-4ab7-a610-8cfece0c3a1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589386, 'reachable_time': 23635, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236470, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.788 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:13:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:15.788 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[444ff606-1f9e-432b-91ee-4f4885700e7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:15 np0005531888 systemd[1]: run-netns-ovnmeta\x2d2b7e9f2d\x2d2098\x2d4c6b\x2d8c54\x2da96b7bb53c94.mount: Deactivated successfully.
Nov 22 03:13:15 np0005531888 podman[236471]: 2025-11-22 08:13:15.842182496 +0000 UTC m=+0.053030755 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:13:15 np0005531888 podman[236469]: 2025-11-22 08:13:15.874236115 +0000 UTC m=+0.088680853 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:13:17 np0005531888 nova_compute[186788]: 2025-11-22 08:13:17.382 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:18 np0005531888 nova_compute[186788]: 2025-11-22 08:13:18.809 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:21.571 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:21.572 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:13:21 np0005531888 nova_compute[186788]: 2025-11-22 08:13:21.572 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:22 np0005531888 nova_compute[186788]: 2025-11-22 08:13:22.385 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:23 np0005531888 nova_compute[186788]: 2025-11-22 08:13:23.812 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:26 np0005531888 podman[236514]: 2025-11-22 08:13:26.692806876 +0000 UTC m=+0.066339052 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.954 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.955 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.955 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.955 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.956 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.956 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.970 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.977 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.977 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.977 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.977 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.977 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.978 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Removable base files: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.978 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.978 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.978 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.979 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.979 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.979 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.980 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 22 03:13:26 np0005531888 nova_compute[186788]: 2025-11-22 08:13:26.980 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Nov 22 03:13:27 np0005531888 nova_compute[186788]: 2025-11-22 08:13:27.351 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799192.3501842, e5342b10-43ea-4199-8bb3-0aa50f8ddd11 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:27 np0005531888 nova_compute[186788]: 2025-11-22 08:13:27.352 186792 INFO nova.compute.manager [-] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:13:27 np0005531888 nova_compute[186788]: 2025-11-22 08:13:27.386 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:27 np0005531888 nova_compute[186788]: 2025-11-22 08:13:27.388 186792 DEBUG nova.compute.manager [None req-12558f23-efbd-4f7d-a75b-678d48234b2a - - - - - -] [instance: e5342b10-43ea-4199-8bb3-0aa50f8ddd11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:28 np0005531888 podman[236534]: 2025-11-22 08:13:28.667155203 +0000 UTC m=+0.045627464 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:13:28 np0005531888 nova_compute[186788]: 2025-11-22 08:13:28.814 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:31 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:31.574 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:31 np0005531888 podman[236558]: 2025-11-22 08:13:31.679732921 +0000 UTC m=+0.051424506 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, release=1755695350)
Nov 22 03:13:32 np0005531888 nova_compute[186788]: 2025-11-22 08:13:32.430 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:33 np0005531888 nova_compute[186788]: 2025-11-22 08:13:33.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:34 np0005531888 podman[236579]: 2025-11-22 08:13:34.703456364 +0000 UTC m=+0.074589786 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 03:13:34 np0005531888 podman[236580]: 2025-11-22 08:13:34.72483595 +0000 UTC m=+0.092432964 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 03:13:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:36.828 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:36.829 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:36.829 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:37 np0005531888 nova_compute[186788]: 2025-11-22 08:13:37.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:38 np0005531888 nova_compute[186788]: 2025-11-22 08:13:38.816 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.228 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.229 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.242 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.400 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.400 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.410 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.411 186792 INFO nova.compute.claims [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.523 186792 DEBUG nova.compute.provider_tree [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.536 186792 DEBUG nova.scheduler.client.report [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.557 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.558 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.617 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.618 186792 DEBUG nova.network.neutron [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.636 186792 INFO nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.810 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.831 186792 DEBUG nova.policy [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.908 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.909 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.909 186792 INFO nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Creating image(s)#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.910 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.910 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.911 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.923 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.992 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.993 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:39 np0005531888 nova_compute[186788]: 2025-11-22 08:13:39.994 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.004 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.067 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.068 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.405 186792 DEBUG nova.network.neutron [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Successfully created port: 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.705 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk 1073741824" returned: 0 in 0.637s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.706 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.706 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.767 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.768 186792 DEBUG nova.virt.disk.api [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.768 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.836 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.837 186792 DEBUG nova.virt.disk.api [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.837 186792 DEBUG nova.objects.instance [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.858 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.858 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Ensure instance console log exists: /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.859 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.859 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:40 np0005531888 nova_compute[186788]: 2025-11-22 08:13:40.860 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.622 186792 DEBUG nova.network.neutron [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Successfully updated port: 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.638 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.638 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.639 186792 DEBUG nova.network.neutron [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.712 186792 DEBUG nova.compute.manager [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-changed-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.713 186792 DEBUG nova.compute.manager [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Refreshing instance network info cache due to event network-changed-3c3e4988-0822-4b4c-9326-3cf6ec5155d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.713 186792 DEBUG oslo_concurrency.lockutils [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:41 np0005531888 nova_compute[186788]: 2025-11-22 08:13:41.836 186792 DEBUG nova.network.neutron [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:13:42 np0005531888 nova_compute[186788]: 2025-11-22 08:13:42.433 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.067 186792 DEBUG nova.network.neutron [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updating instance_info_cache with network_info: [{"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.099 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.100 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Instance network_info: |[{"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.100 186792 DEBUG oslo_concurrency.lockutils [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.100 186792 DEBUG nova.network.neutron [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Refreshing network info cache for port 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.103 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Start _get_guest_xml network_info=[{"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.107 186792 WARNING nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.113 186792 DEBUG nova.virt.libvirt.host [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.114 186792 DEBUG nova.virt.libvirt.host [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.117 186792 DEBUG nova.virt.libvirt.host [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.118 186792 DEBUG nova.virt.libvirt.host [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.119 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.119 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.119 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.120 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.120 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.120 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.120 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.121 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.121 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.121 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.121 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.121 186792 DEBUG nova.virt.hardware [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.125 186792 DEBUG nova.virt.libvirt.vif [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-19652717',display_name='tempest-₡-19652717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest--19652717',id=137,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-q1p88s0k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:13:39Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=2f6a7fca-8a29-4c0c-936f-8184ac3b4abe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.125 186792 DEBUG nova.network.os_vif_util [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.126 186792 DEBUG nova.network.os_vif_util [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.126 186792 DEBUG nova.objects.instance [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.136 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <uuid>2f6a7fca-8a29-4c0c-936f-8184ac3b4abe</uuid>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <name>instance-00000089</name>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:name>tempest-₡-19652717</nova:name>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:13:43</nova:creationTime>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        <nova:port uuid="3c3e4988-0822-4b4c-9326-3cf6ec5155d9">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <entry name="serial">2f6a7fca-8a29-4c0c-936f-8184ac3b4abe</entry>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <entry name="uuid">2f6a7fca-8a29-4c0c-936f-8184ac3b4abe</entry>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.config"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:c6:ba:92"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <target dev="tap3c3e4988-08"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/console.log" append="off"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:13:43 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:13:43 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:13:43 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:13:43 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.137 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Preparing to wait for external event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.138 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.138 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.138 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.139 186792 DEBUG nova.virt.libvirt.vif [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-19652717',display_name='tempest-₡-19652717',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest--19652717',id=137,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-q1p88s0k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:13:39Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=2f6a7fca-8a29-4c0c-936f-8184ac3b4abe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.139 186792 DEBUG nova.network.os_vif_util [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.140 186792 DEBUG nova.network.os_vif_util [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.140 186792 DEBUG os_vif [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.140 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.141 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.141 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.143 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.143 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c3e4988-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.144 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c3e4988-08, col_values=(('external_ids', {'iface-id': '3c3e4988-0822-4b4c-9326-3cf6ec5155d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:ba:92', 'vm-uuid': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.145 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:43 np0005531888 NetworkManager[55166]: <info>  [1763799223.1460] manager: (tap3c3e4988-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.151 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.152 186792 INFO os_vif [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08')#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.238 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.238 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.238 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:c6:ba:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.239 186792 INFO nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Using config drive#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.817 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.858 186792 INFO nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Creating config drive at /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.config#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.864 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpre_fef9t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:43 np0005531888 nova_compute[186788]: 2025-11-22 08:13:43.990 186792 DEBUG oslo_concurrency.processutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpre_fef9t" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:44 np0005531888 kernel: tap3c3e4988-08: entered promiscuous mode
Nov 22 03:13:44 np0005531888 NetworkManager[55166]: <info>  [1763799224.0446] manager: (tap3c3e4988-08): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.045 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:44Z|00543|binding|INFO|Claiming lport 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 for this chassis.
Nov 22 03:13:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:44Z|00544|binding|INFO|3c3e4988-0822-4b4c-9326-3cf6ec5155d9: Claiming fa:16:3e:c6:ba:92 10.100.0.4
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 systemd-udevd[236657]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:13:44 np0005531888 systemd-machined[153106]: New machine qemu-66-instance-00000089.
Nov 22 03:13:44 np0005531888 NetworkManager[55166]: <info>  [1763799224.1021] device (tap3c3e4988-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:13:44 np0005531888 NetworkManager[55166]: <info>  [1763799224.1037] device (tap3c3e4988-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 systemd[1]: Started Virtual Machine qemu-66-instance-00000089.
Nov 22 03:13:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:44Z|00545|binding|INFO|Setting lport 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 ovn-installed in OVS
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:44Z|00546|binding|INFO|Setting lport 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 up in Southbound
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.394 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:ba:92 10.100.0.4'], port_security=['fa:16:3e:c6:ba:92 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=3c3e4988-0822-4b4c-9326-3cf6ec5155d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.396 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.397 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.408 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3db3d0b5-7a0a-4e3c-9377-83a106c13399]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.409 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66c945b4-71 in ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.411 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66c945b4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.411 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4fb21a-3f1e-46e3-b009-6560cafcaf04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.413 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f45eca-2170-4bda-a908-5dc0a72746ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.426 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[39093bca-5b23-4fa5-9328-6bc33f992479]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.449 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[37383585-6090-4333-a034-79833a30e2c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.483 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[104a8907-1e5c-4724-b16d-618cb5ced97e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 NetworkManager[55166]: <info>  [1763799224.4914] manager: (tap66c945b4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/258)
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.492 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fbeb7399-fcfe-4de2-bd3f-9e3601330bdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.526 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[83e615be-9358-44b9-b52a-b31d9b78533f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.529 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[aa73aec0-9d15-44a0-a32a-0779f9cf038f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 NetworkManager[55166]: <info>  [1763799224.5550] device (tap66c945b4-70): carrier: link connected
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.560 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fdfa7d59-dca5-47b3-8ad1-c607d8872119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.577 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c7ea36-29b0-4035-8279-758cbda5d2a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 27089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236690, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.593 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d7292f20-b361-429a-8ee2-55180818a185]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:5a27'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596519, 'tstamp': 596519}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236691, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.609 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[53bbd88a-ea7d-49bd-9ff4-c1cf98912fe6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 27089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236693, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.640 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0080dd-17be-4c6c-85e3-4499c23ab105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.704 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[db4b9fa7-9880-401b-9eae-4ef0df712317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.706 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.706 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.706 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:44 np0005531888 NetworkManager[55166]: <info>  [1763799224.7085] manager: (tap66c945b4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.708 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 kernel: tap66c945b4-70: entered promiscuous mode
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.711 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.712 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:13:44Z|00547|binding|INFO|Releasing lport d6ef1392-aa2a-4e3e-91ba-ec0ce61e416a from this chassis (sb_readonly=0)
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.723 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.724 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.726 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aee9dedd-48e3-4549-a62e-0b8d39f7bcb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.727 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:13:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:44.728 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'env', 'PROCESS_TAG=haproxy-66c945b4-7237-4e85-b411-0c51b31ea31a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66c945b4-7237-4e85-b411-0c51b31ea31a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.771 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799224.7710013, 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.772 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] VM Started (Lifecycle Event)#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.791 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.796 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799224.7713687, 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.796 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.814 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.817 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:13:44 np0005531888 nova_compute[186788]: 2025-11-22 08:13:44.837 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:13:45 np0005531888 podman[236731]: 2025-11-22 08:13:45.090346511 +0000 UTC m=+0.024558685 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.215 186792 DEBUG nova.compute.manager [req-65f28e74-913c-42ed-b2e2-76a6c6c185e8 req-f31f797a-601c-4899-9206-436958f6a2af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.216 186792 DEBUG oslo_concurrency.lockutils [req-65f28e74-913c-42ed-b2e2-76a6c6c185e8 req-f31f797a-601c-4899-9206-436958f6a2af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.216 186792 DEBUG oslo_concurrency.lockutils [req-65f28e74-913c-42ed-b2e2-76a6c6c185e8 req-f31f797a-601c-4899-9206-436958f6a2af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.216 186792 DEBUG oslo_concurrency.lockutils [req-65f28e74-913c-42ed-b2e2-76a6c6c185e8 req-f31f797a-601c-4899-9206-436958f6a2af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.216 186792 DEBUG nova.compute.manager [req-65f28e74-913c-42ed-b2e2-76a6c6c185e8 req-f31f797a-601c-4899-9206-436958f6a2af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Processing event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.217 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.221 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799225.221189, 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.221 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.223 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.227 186792 INFO nova.virt.libvirt.driver [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Instance spawned successfully.#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.228 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.240 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.246 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.252 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.252 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.253 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.253 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.254 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.254 186792 DEBUG nova.virt.libvirt.driver [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.277 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.313 186792 INFO nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Took 5.41 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.314 186792 DEBUG nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.391 186792 INFO nova.compute.manager [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Took 6.05 seconds to build instance.#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.415 186792 DEBUG oslo_concurrency.lockutils [None req-3426aac8-99f5-43d6-be78-abd600a2b7a0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.535 186792 DEBUG nova.network.neutron [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updated VIF entry in instance network info cache for port 3c3e4988-0822-4b4c-9326-3cf6ec5155d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.537 186792 DEBUG nova.network.neutron [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updating instance_info_cache with network_info: [{"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:45 np0005531888 nova_compute[186788]: 2025-11-22 08:13:45.555 186792 DEBUG oslo_concurrency.lockutils [req-8585a560-5a08-4fa1-8198-feab705aec0a req-64d83fcd-2000-40fd-908c-9e69e76cf649 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:46 np0005531888 podman[236731]: 2025-11-22 08:13:46.052418901 +0000 UTC m=+0.986631045 container create 65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 03:13:46 np0005531888 systemd[1]: Started libpod-conmon-65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74.scope.
Nov 22 03:13:46 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:13:46 np0005531888 podman[236744]: 2025-11-22 08:13:46.271857028 +0000 UTC m=+0.179427603 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:13:46 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9d7125f32fa27413e4966b8f6aa5a34401206ab17e03fc75c2bbebd2a1a7c11/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:13:46 np0005531888 podman[236731]: 2025-11-22 08:13:46.362040846 +0000 UTC m=+1.296253020 container init 65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:13:46 np0005531888 podman[236731]: 2025-11-22 08:13:46.369742196 +0000 UTC m=+1.303954370 container start 65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:13:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [NOTICE]   (236795) : New worker (236797) forked
Nov 22 03:13:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [NOTICE]   (236795) : Loading success.
Nov 22 03:13:46 np0005531888 podman[236745]: 2025-11-22 08:13:46.47646556 +0000 UTC m=+0.380430547 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:13:47 np0005531888 nova_compute[186788]: 2025-11-22 08:13:47.303 186792 DEBUG nova.compute.manager [req-b0c2ba0b-e02e-451a-adf6-992990bad4e7 req-c8427fd3-7b75-407e-879f-fc572336ec3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:47 np0005531888 nova_compute[186788]: 2025-11-22 08:13:47.305 186792 DEBUG oslo_concurrency.lockutils [req-b0c2ba0b-e02e-451a-adf6-992990bad4e7 req-c8427fd3-7b75-407e-879f-fc572336ec3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:47 np0005531888 nova_compute[186788]: 2025-11-22 08:13:47.305 186792 DEBUG oslo_concurrency.lockutils [req-b0c2ba0b-e02e-451a-adf6-992990bad4e7 req-c8427fd3-7b75-407e-879f-fc572336ec3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:47 np0005531888 nova_compute[186788]: 2025-11-22 08:13:47.306 186792 DEBUG oslo_concurrency.lockutils [req-b0c2ba0b-e02e-451a-adf6-992990bad4e7 req-c8427fd3-7b75-407e-879f-fc572336ec3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:47 np0005531888 nova_compute[186788]: 2025-11-22 08:13:47.306 186792 DEBUG nova.compute.manager [req-b0c2ba0b-e02e-451a-adf6-992990bad4e7 req-c8427fd3-7b75-407e-879f-fc572336ec3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] No waiting events found dispatching network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:47 np0005531888 nova_compute[186788]: 2025-11-22 08:13:47.306 186792 WARNING nova.compute.manager [req-b0c2ba0b-e02e-451a-adf6-992990bad4e7 req-c8427fd3-7b75-407e-879f-fc572336ec3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received unexpected event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:13:48 np0005531888 nova_compute[186788]: 2025-11-22 08:13:48.146 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:48 np0005531888 nova_compute[186788]: 2025-11-22 08:13:48.819 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:53 np0005531888 nova_compute[186788]: 2025-11-22 08:13:53.149 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:53 np0005531888 nova_compute[186788]: 2025-11-22 08:13:53.822 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:54.297 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:a2:f4 2001:db8:0:1:f816:3eff:fe7e:a2f4 2001:db8::f816:3eff:fe7e:a2f4'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7e:a2f4/64 2001:db8::f816:3eff:fe7e:a2f4/64', 'neutron:device_id': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=433cf940-3b59-425c-aeb8-689a57de46c2) old=Port_Binding(mac=['fa:16:3e:7e:a2:f4 2001:db8::f816:3eff:fe7e:a2f4'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7e:a2f4/64', 'neutron:device_id': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:54.300 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 433cf940-3b59-425c-aeb8-689a57de46c2 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 updated#033[00m
Nov 22 03:13:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:54.302 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:13:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:54.303 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[baeba0e5-f78c-488a-a4f7-4decfae8cc46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531888 podman[236808]: 2025-11-22 08:13:57.685159209 +0000 UTC m=+0.056387219 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:13:58 np0005531888 nova_compute[186788]: 2025-11-22 08:13:58.152 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:58.691 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:58 np0005531888 nova_compute[186788]: 2025-11-22 08:13:58.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:13:58.693 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:13:58 np0005531888 nova_compute[186788]: 2025-11-22 08:13:58.824 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:59 np0005531888 podman[236829]: 2025-11-22 08:13:59.675253162 +0000 UTC m=+0.049835697 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:13:59 np0005531888 nova_compute[186788]: 2025-11-22 08:13:59.981 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:01 np0005531888 nova_compute[186788]: 2025-11-22 08:14:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:01 np0005531888 nova_compute[186788]: 2025-11-22 08:14:01.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:14:01 np0005531888 nova_compute[186788]: 2025-11-22 08:14:01.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:14:02 np0005531888 nova_compute[186788]: 2025-11-22 08:14:02.213 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:02 np0005531888 nova_compute[186788]: 2025-11-22 08:14:02.214 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:02 np0005531888 nova_compute[186788]: 2025-11-22 08:14:02.214 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:14:02 np0005531888 nova_compute[186788]: 2025-11-22 08:14:02.215 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:14:02Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:ba:92 10.100.0.4
Nov 22 03:14:02 np0005531888 ovn_controller[95067]: 2025-11-22T08:14:02Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:ba:92 10.100.0.4
Nov 22 03:14:02 np0005531888 podman[236872]: 2025-11-22 08:14:02.678007541 +0000 UTC m=+0.049873488 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Nov 22 03:14:03 np0005531888 nova_compute[186788]: 2025-11-22 08:14:03.154 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:03 np0005531888 nova_compute[186788]: 2025-11-22 08:14:03.826 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:03 np0005531888 nova_compute[186788]: 2025-11-22 08:14:03.947 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updating instance_info_cache with network_info: [{"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:03 np0005531888 nova_compute[186788]: 2025-11-22 08:14:03.970 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:03 np0005531888 nova_compute[186788]: 2025-11-22 08:14:03.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:14:03 np0005531888 nova_compute[186788]: 2025-11-22 08:14:03.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:04 np0005531888 nova_compute[186788]: 2025-11-22 08:14:04.965 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:05 np0005531888 podman[236894]: 2025-11-22 08:14:05.686983021 +0000 UTC m=+0.055748012 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:14:05 np0005531888 podman[236895]: 2025-11-22 08:14:05.745682865 +0000 UTC m=+0.112367106 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:14:05 np0005531888 nova_compute[186788]: 2025-11-22 08:14:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:08 np0005531888 nova_compute[186788]: 2025-11-22 08:14:08.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:08.695 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:08 np0005531888 nova_compute[186788]: 2025-11-22 08:14:08.827 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:08 np0005531888 nova_compute[186788]: 2025-11-22 08:14:08.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:09 np0005531888 nova_compute[186788]: 2025-11-22 08:14:09.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.003 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.004 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.004 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.004 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.099 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.155 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.156 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.210 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.375 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.377 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5540MB free_disk=73.24517440795898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.377 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.377 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.645 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.645 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.646 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.870 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.897 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.950 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:14:10 np0005531888 nova_compute[186788]: 2025-11-22 08:14:10.951 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:11 np0005531888 nova_compute[186788]: 2025-11-22 08:14:11.952 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:12 np0005531888 nova_compute[186788]: 2025-11-22 08:14:12.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:12 np0005531888 nova_compute[186788]: 2025-11-22 08:14:12.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:14:12 np0005531888 nova_compute[186788]: 2025-11-22 08:14:12.978 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:14:12 np0005531888 nova_compute[186788]: 2025-11-22 08:14:12.980 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:12 np0005531888 nova_compute[186788]: 2025-11-22 08:14:12.980 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:14:12 np0005531888 nova_compute[186788]: 2025-11-22 08:14:12.990 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:13 np0005531888 nova_compute[186788]: 2025-11-22 08:14:13.160 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:13 np0005531888 nova_compute[186788]: 2025-11-22 08:14:13.830 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:14 np0005531888 nova_compute[186788]: 2025-11-22 08:14:14.002 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:14 np0005531888 nova_compute[186788]: 2025-11-22 08:14:14.002 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:14:16 np0005531888 podman[236941]: 2025-11-22 08:14:16.680178009 +0000 UTC m=+0.048486924 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:14:16 np0005531888 podman[236942]: 2025-11-22 08:14:16.689509699 +0000 UTC m=+0.051266262 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:14:18 np0005531888 nova_compute[186788]: 2025-11-22 08:14:18.163 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:18 np0005531888 nova_compute[186788]: 2025-11-22 08:14:18.832 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:23 np0005531888 nova_compute[186788]: 2025-11-22 08:14:23.166 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:23 np0005531888 nova_compute[186788]: 2025-11-22 08:14:23.833 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531888 nova_compute[186788]: 2025-11-22 08:14:26.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:28 np0005531888 nova_compute[186788]: 2025-11-22 08:14:28.168 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:28 np0005531888 podman[236984]: 2025-11-22 08:14:28.683481018 +0000 UTC m=+0.059829702 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:14:28 np0005531888 nova_compute[186788]: 2025-11-22 08:14:28.835 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:30 np0005531888 podman[237006]: 2025-11-22 08:14:30.671442519 +0000 UTC m=+0.049714914 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:14:33 np0005531888 nova_compute[186788]: 2025-11-22 08:14:33.171 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:33 np0005531888 podman[237030]: 2025-11-22 08:14:33.719120801 +0000 UTC m=+0.077778214 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64)
Nov 22 03:14:33 np0005531888 nova_compute[186788]: 2025-11-22 08:14:33.837 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.157 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.158 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.281 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.478 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.479 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.483 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.484 186792 INFO nova.compute.claims [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:14:36 np0005531888 podman[237052]: 2025-11-22 08:14:36.682596702 +0000 UTC m=+0.056233844 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:14:36 np0005531888 podman[237053]: 2025-11-22 08:14:36.703399214 +0000 UTC m=+0.074409651 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.712 186792 DEBUG nova.compute.provider_tree [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.724 186792 DEBUG nova.scheduler.client.report [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.746 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.747 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:14:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:36.830 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:36.830 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:36.831 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.848 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'name': 'tempest-₡-19652717', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000089', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '70cb231da30d4002a985cf18a579cd6a', 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'hostId': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.848 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.865 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/memory.usage volume: 42.4375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d6e0f58-d348-4e86-a953-00a51e97f1f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4375, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'timestamp': '2025-11-22T08:14:36.848996', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4908135c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.564615833, 'message_signature': '214482fe5cce922e7666227a30fe54f531a1f709bb2b07fe03e18ba90be610b0'}]}, 'timestamp': '2025-11-22 08:14:36.865794', '_unique_id': '1af56368375744f88d669fdab3b7d326'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.870 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe / tap3c3e4988-08 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.870 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cb108de-639e-42bb-a6ed-55d1a00fd62c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.868144', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '4908d4fe-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '2af5dc44b7911200872dc56617b6b0ce0e93596c0a678027a12b549b60203799'}]}, 'timestamp': '2025-11-22 08:14:36.870707', '_unique_id': '0111822077084dea883b83b2697a6b97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.872 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.872 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-₡-19652717>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-19652717>]
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.873 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89254d31-fff6-4616-97f6-2ad48bea8627', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.873529', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '490950aa-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': 'e82eb7737ba176e17f35475b81336df8eae455a1e0881fbbbfcf1cd949bbbecd'}]}, 'timestamp': '2025-11-22 08:14:36.873899', '_unique_id': '94267877fb3c4e44924b1c26fe099755'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.874 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.875 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2e2bacc-b76f-4d33-aa69-01ec2d9562f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.875860', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '4909aa1e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '1d5f00b91fcbafc77bb2436b94cc908f9ca4435ad837562e961fdad6a77cef1f'}]}, 'timestamp': '2025-11-22 08:14:36.876155', '_unique_id': '3752fca0fed7490385c51a2c596ec82c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.877 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8500a0f0-a968-4b53-80c8-c29ecd9f7538', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.877804', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '4909f58c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '163050138555764b55e9508e9c12758741b8bc8478c79ed604a2d7f6c1d98d8d'}]}, 'timestamp': '2025-11-22 08:14:36.878079', '_unique_id': '7b8e5c4fb4cf4893a0ccb4999cf2422b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.903 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.bytes volume: 72908800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.904 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24c43aee-25e9-4f8f-a4ef-640a4417c6e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72908800, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.879509', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '490de9d0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': '13ef69f8852d7f40deec732d1d8d5ee77cca406bba0c8413f9984df7d378b774'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.879509', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '490df880-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': '269fecbd7887301783330fec9f6491c061123e5ecb17f40aec12864fb288f50a'}]}, 'timestamp': '2025-11-22 08:14:36.904397', '_unique_id': '21268a117b1f48a6abba644f94a146bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.906 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.907 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.latency volume: 36741127659 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c47d8bea-5e01-4464-a44c-ac553a483743', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 36741127659, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.907838', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '490e8c28-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': '3309bf333e0b35ffb4d0832e3c78bd3d58d0d7c7694ae63817bcb17cbc2ee840'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.907838', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '490e9704-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': 'c1bb0bad7fb2f7fc7f562a82c2675c0026f6ce0c82c93f4a99988afd9667a186'}]}, 'timestamp': '2025-11-22 08:14:36.908416', '_unique_id': 'c331a07569aa4858bd38dc3cda6ce1b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63893f64-064c-403b-a03c-634dca73cf6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.910060', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '490ee222-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '2a41d48f919d44d9e95004967bc9f22807e17acfa1132e34c4b23e8d432ef795'}]}, 'timestamp': '2025-11-22 08:14:36.910347', '_unique_id': '3c58d03490e745c19f1d91871f593f92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.910 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.911 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '917ae3a3-1f56-410f-b4a2-22c984cc3fde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.911907', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '490f2a8e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '3da1445cfd09e4a47f212267ab10496ab664641d96de7bb0ab5a667a3fa7b54e'}]}, 'timestamp': '2025-11-22 08:14:36.912203', '_unique_id': '039ac0f8e25e4835b0612671d76c8464'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.922 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.923 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db608aa6-6be8-4948-b81f-bc3ea62de709', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.913698', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4910d8b6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.613201859, 'message_signature': '0e8c9da39e649502a4456fd6c27085ad7beacaa0cf867cc27c3ba113a80dfc99'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 
'2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.913698', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4910e4e6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.613201859, 'message_signature': '4398f7ede07821b4307eec89bb14b9d6ecf3f3f5587d92c5daa2deb3286ad07f'}]}, 'timestamp': '2025-11-22 08:14:36.923518', '_unique_id': '1ebe364316794ab0bbdec434d3f5a207'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.925 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.requests volume: 282 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.925 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de82f796-8e17-4b96-bf1f-631977667abb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 282, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.925339', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4911363a-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': '7192505336d788dd3570ccc1db9b1c19ab70a2ced47a324e3e4c696eb1fd05b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 
'2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.925339', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49114076-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': '4cdee53bbdaa43e476b45fbaa62e776d0159aa4c7a14964b63d7a37b0424a3c6'}]}, 'timestamp': '2025-11-22 08:14:36.925855', '_unique_id': 'cc3d9fa96082498386e5c04c1c94245f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.927 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.927 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '398630d7-b637-4b6c-8ff0-9111029a8c51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.927340', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4911843c-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.613201859, 'message_signature': '27091a0b41e99ca95525cb221f6aa9ba09c1b6947dfef1757554ba79174491a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.927340', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49118f04-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.613201859, 'message_signature': 'ce2932677f46c770c4d13452993b8a34b5be774df1528f64517dbee494efcf47'}]}, 'timestamp': '2025-11-22 08:14:36.927867', '_unique_id': '456f6f6263124b78bcd5f12f6d09e356'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.929 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faaea8dc-5863-4b6c-84c4-cdea2c184dac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.929342', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '4911d2ac-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '9b2456d8029312cffff12467de3f3252225860f1eeef87d1040d849485bed3f7'}]}, 'timestamp': '2025-11-22 08:14:36.929628', '_unique_id': '12dc1a12821a48efaf2831bac41cf11e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.931 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.931 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4e74526-6a11-4ce1-9e29-358194673670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.931155', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4912191a-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': '8c578772edd133eda6eefc360c0a215badb0f1f728614340d3e08a8ed8f8afeb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 
'2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.931155', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49122234-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': 'b1fa5df51bcabb834b2c453ea0cef1014e70fe0fb6e02723e6d1746b23ae6caa'}]}, 'timestamp': '2025-11-22 08:14:36.931653', '_unique_id': '57c5c3b6f24e494489418a7fa4a065e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4162f620-1191-4315-8104-7cdf71b8bd27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.933213', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '49126a14-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': 'b5098c7b76c46c0c542b2e242dc1ce825306c9496d212a7be5bf2f2e2740eb88'}]}, 'timestamp': '2025-11-22 08:14:36.933487', '_unique_id': 'fa5946cdcbd14a4496bd5d26b11ff725'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.934 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c74573b-6664-4d69-9acc-d5bf34fa6607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.934948', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '4912ad58-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '9d1f0c4c0a2467cfb01d2f370ad5442b3abb3c312288963b839583b00595af01'}]}, 'timestamp': '2025-11-22 08:14:36.935207', '_unique_id': '4a8db46aa6b047b48c2109381671f701'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.936 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.936 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-₡-19652717>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-19652717>]
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b710f58-896c-4130-b2cc-1efef2fe52b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.937057', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4912ffb0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': 'bfa360714232f51fb3892f22c5e42e7ce6111f5aa65c36964032eefd65abfb34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.937057', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '491308ca-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': 'bd2b3faf630163daa5960c4df70a43572627ab456fce399adb32e3a09fd4054f'}]}, 'timestamp': '2025-11-22 08:14:36.937536', '_unique_id': '1f910489d01f4f39b7d7f4cdb3369811'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.939 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.939 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-₡-19652717>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-19652717>]
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.939 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82bdc9ba-d902-42f5-bf98-f90fb7139075', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:14:36.939367', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '491359d8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.567654689, 'message_signature': '761b9a13ac7b9b82b15b80dc998fbe952c1850817a53a05f8525652423d6d047'}]}, 'timestamp': '2025-11-22 08:14:36.939643', '_unique_id': 'eb76a458a20446a4a52aabfd0cf49039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/cpu volume: 15240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05a1dda8-12c5-4a43-81f2-492d182bf6f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15240000000, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'timestamp': '2025-11-22T08:14:36.941218', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4913a230-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.564615833, 'message_signature': '96c61df3ed2cfbdba54b78189f2bd6e10dc59cef51618013c489d7cc101a5319'}]}, 'timestamp': '2025-11-22 08:14:36.941470', '_unique_id': '1fe32bf73dbc4defa175bf0b22773c7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.942 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b96f7d3-4163-44fa-a391-76b2b228c241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.942876', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4913e2e0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.613201859, 'message_signature': '1d8d5cfb1ad17952175a342645618f555f59f0de500d53b0b1e4d2d0bfc05a55'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.942876', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4913ebf0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.613201859, 'message_signature': 'd42e122d685e19f8ef9838e5cfb35f17b6fcaa04776527adf08cd144b864a2ba'}]}, 'timestamp': '2025-11-22 08:14:36.943351', '_unique_id': '6ef0820419474283bbf479886f0ebdee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.944 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.944 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-₡-19652717>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-19652717>]
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.945 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.945 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.latency volume: 2551870973 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.945 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.latency volume: 159932662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e26eae6a-b8f2-475e-b74f-4d3ec6ecc519', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2551870973, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:14:36.945174', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49143c90-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': 'dbc2340e7fe1fc65f4abc0aaa8e49c6438c6cf2d42840d5699cbaa3ba75deaa5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 159932662, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:14:36.945174', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '491445a0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6017.579034399, 'message_signature': 'a3096620dc7cb8df9e5c78c3f3cb8ab8179251b623643133e1be6ab5d315480a'}]}, 'timestamp': '2025-11-22 08:14:36.945664', '_unique_id': '7bb155da762f4f958462d5aac7e80bfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:14:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.983 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:14:36 np0005531888 nova_compute[186788]: 2025-11-22 08:14:36.984 186792 DEBUG nova.network.neutron [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.005 186792 INFO nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.028 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.141 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.142 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.143 186792 INFO nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Creating image(s)#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.143 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.144 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.145 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.158 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.213 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.214 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.214 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.225 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.283 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.284 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.397 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk 1073741824" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.399 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.400 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.463 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.465 186792 DEBUG nova.virt.disk.api [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.466 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.534 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.536 186792 DEBUG nova.virt.disk.api [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.536 186792 DEBUG nova.objects.instance [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.559 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.560 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Ensure instance console log exists: /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.561 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.561 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.561 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:37 np0005531888 nova_compute[186788]: 2025-11-22 08:14:37.765 186792 DEBUG nova.policy [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:14:38 np0005531888 nova_compute[186788]: 2025-11-22 08:14:38.173 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:38 np0005531888 nova_compute[186788]: 2025-11-22 08:14:38.839 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:39 np0005531888 nova_compute[186788]: 2025-11-22 08:14:39.853 186792 DEBUG nova.network.neutron [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Successfully created port: 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:14:40 np0005531888 nova_compute[186788]: 2025-11-22 08:14:40.967 186792 DEBUG nova.network.neutron [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Successfully updated port: 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:14:40 np0005531888 nova_compute[186788]: 2025-11-22 08:14:40.988 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-8ed928e2-c2fc-4d34-a5df-e191f0ee2880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:40 np0005531888 nova_compute[186788]: 2025-11-22 08:14:40.988 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-8ed928e2-c2fc-4d34-a5df-e191f0ee2880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:40 np0005531888 nova_compute[186788]: 2025-11-22 08:14:40.988 186792 DEBUG nova.network.neutron [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:14:41 np0005531888 nova_compute[186788]: 2025-11-22 08:14:41.289 186792 DEBUG nova.compute.manager [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received event network-changed-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:41 np0005531888 nova_compute[186788]: 2025-11-22 08:14:41.290 186792 DEBUG nova.compute.manager [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Refreshing instance network info cache due to event network-changed-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:41 np0005531888 nova_compute[186788]: 2025-11-22 08:14:41.291 186792 DEBUG oslo_concurrency.lockutils [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8ed928e2-c2fc-4d34-a5df-e191f0ee2880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:41 np0005531888 nova_compute[186788]: 2025-11-22 08:14:41.486 186792 DEBUG nova.network.neutron [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.786 186792 DEBUG nova.network.neutron [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Updating instance_info_cache with network_info: [{"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.840 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-8ed928e2-c2fc-4d34-a5df-e191f0ee2880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.840 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Instance network_info: |[{"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.841 186792 DEBUG oslo_concurrency.lockutils [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8ed928e2-c2fc-4d34-a5df-e191f0ee2880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.841 186792 DEBUG nova.network.neutron [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Refreshing network info cache for port 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.845 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Start _get_guest_xml network_info=[{"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.850 186792 WARNING nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.857 186792 DEBUG nova.virt.libvirt.host [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.858 186792 DEBUG nova.virt.libvirt.host [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.862 186792 DEBUG nova.virt.libvirt.host [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.863 186792 DEBUG nova.virt.libvirt.host [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.864 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.864 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.865 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.865 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.865 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.865 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.866 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.866 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.866 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.866 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.866 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.867 186792 DEBUG nova.virt.hardware [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.870 186792 DEBUG nova.virt.libvirt.vif [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1864001579',display_name='tempest-ServersTestJSON-server-1864001579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1864001579',id=142,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-ocio1nf3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:37Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=8ed928e2-c2fc-4d34-a5df-e191f0ee2880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.870 186792 DEBUG nova.network.os_vif_util [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.871 186792 DEBUG nova.network.os_vif_util [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.872 186792 DEBUG nova.objects.instance [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.891 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <uuid>8ed928e2-c2fc-4d34-a5df-e191f0ee2880</uuid>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <name>instance-0000008e</name>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersTestJSON-server-1864001579</nova:name>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:14:42</nova:creationTime>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        <nova:port uuid="0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <entry name="serial">8ed928e2-c2fc-4d34-a5df-e191f0ee2880</entry>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <entry name="uuid">8ed928e2-c2fc-4d34-a5df-e191f0ee2880</entry>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.config"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:de:4a:80"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <target dev="tap0a0e461d-9e"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/console.log" append="off"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:14:42 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:14:42 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:14:42 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:14:42 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.892 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Preparing to wait for external event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.893 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.893 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.893 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.894 186792 DEBUG nova.virt.libvirt.vif [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1864001579',display_name='tempest-ServersTestJSON-server-1864001579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1864001579',id=142,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-ocio1nf3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:37Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=8ed928e2-c2fc-4d34-a5df-e191f0ee2880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.894 186792 DEBUG nova.network.os_vif_util [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.895 186792 DEBUG nova.network.os_vif_util [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.895 186792 DEBUG os_vif [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.896 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.896 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.897 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.899 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.899 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a0e461d-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.900 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0a0e461d-9e, col_values=(('external_ids', {'iface-id': '0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:4a:80', 'vm-uuid': '8ed928e2-c2fc-4d34-a5df-e191f0ee2880'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.901 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:42 np0005531888 NetworkManager[55166]: <info>  [1763799282.9022] manager: (tap0a0e461d-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.903 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.907 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.908 186792 INFO os_vif [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e')#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.974 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.974 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.975 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:de:4a:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:14:42 np0005531888 nova_compute[186788]: 2025-11-22 08:14:42.975 186792 INFO nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Using config drive#033[00m
Nov 22 03:14:43 np0005531888 nova_compute[186788]: 2025-11-22 08:14:43.840 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:45 np0005531888 nova_compute[186788]: 2025-11-22 08:14:45.308 186792 INFO nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Creating config drive at /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.config#033[00m
Nov 22 03:14:45 np0005531888 nova_compute[186788]: 2025-11-22 08:14:45.318 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4kjp6eo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:45 np0005531888 nova_compute[186788]: 2025-11-22 08:14:45.472 186792 DEBUG oslo_concurrency.processutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4kjp6eo" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:45 np0005531888 kernel: tap0a0e461d-9e: entered promiscuous mode
Nov 22 03:14:45 np0005531888 NetworkManager[55166]: <info>  [1763799285.5462] manager: (tap0a0e461d-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Nov 22 03:14:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:14:45Z|00548|binding|INFO|Claiming lport 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd for this chassis.
Nov 22 03:14:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:14:45Z|00549|binding|INFO|0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd: Claiming fa:16:3e:de:4a:80 10.100.0.7
Nov 22 03:14:45 np0005531888 nova_compute[186788]: 2025-11-22 08:14:45.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:14:45Z|00550|binding|INFO|Setting lport 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd ovn-installed in OVS
Nov 22 03:14:45 np0005531888 nova_compute[186788]: 2025-11-22 08:14:45.565 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:14:45Z|00551|binding|INFO|Setting lport 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd up in Southbound
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.568 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:4a:80 10.100.0.7'], port_security=['fa:16:3e:de:4a:80 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8ed928e2-c2fc-4d34-a5df-e191f0ee2880', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.570 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.574 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:14:45 np0005531888 systemd-udevd[237128]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:14:45 np0005531888 NetworkManager[55166]: <info>  [1763799285.5919] device (tap0a0e461d-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:14:45 np0005531888 NetworkManager[55166]: <info>  [1763799285.5926] device (tap0a0e461d-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.592 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[32e14f42-be2e-4e7a-866d-0318755b0b39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:45 np0005531888 systemd-machined[153106]: New machine qemu-67-instance-0000008e.
Nov 22 03:14:45 np0005531888 systemd[1]: Started Virtual Machine qemu-67-instance-0000008e.
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.624 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2544b3-c033-47a1-8a05-8b8defb5c972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.629 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c74e85a9-3839-48dc-aa22-fcf206b20c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.661 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6e305a84-cb0a-417c-b64a-19c23a3aeb2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.680 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f51f6016-a6d9-41f3-833e-adc69e965e62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 17106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237144, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.701 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe487f4-0b08-418e-8415-187d8ad8f5e6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596531, 'tstamp': 596531}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237145, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596534, 'tstamp': 596534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237145, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.703 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:45 np0005531888 nova_compute[186788]: 2025-11-22 08:14:45.704 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.705 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.706 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.706 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:45.706 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.001 186792 DEBUG nova.compute.manager [req-ffb0dda1-acbb-4601-aa39-a9fc0295fe34 req-1792f2f1-8e08-48d5-8adb-ce1fe0b2aa5f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.002 186792 DEBUG oslo_concurrency.lockutils [req-ffb0dda1-acbb-4601-aa39-a9fc0295fe34 req-1792f2f1-8e08-48d5-8adb-ce1fe0b2aa5f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.002 186792 DEBUG oslo_concurrency.lockutils [req-ffb0dda1-acbb-4601-aa39-a9fc0295fe34 req-1792f2f1-8e08-48d5-8adb-ce1fe0b2aa5f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.002 186792 DEBUG oslo_concurrency.lockutils [req-ffb0dda1-acbb-4601-aa39-a9fc0295fe34 req-1792f2f1-8e08-48d5-8adb-ce1fe0b2aa5f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.003 186792 DEBUG nova.compute.manager [req-ffb0dda1-acbb-4601-aa39-a9fc0295fe34 req-1792f2f1-8e08-48d5-8adb-ce1fe0b2aa5f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Processing event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.211 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799286.2106245, 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.211 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] VM Started (Lifecycle Event)
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.214 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.219 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.224 186792 INFO nova.virt.libvirt.driver [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Instance spawned successfully.
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.225 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.233 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.239 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.261 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.262 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.264 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.265 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.265 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.266 186792 DEBUG nova.virt.libvirt.driver [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.272 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.272 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799286.2109509, 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.272 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] VM Paused (Lifecycle Event)
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.297 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.301 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799286.2178986, 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.302 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] VM Resumed (Lifecycle Event)
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.329 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.334 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.339 186792 INFO nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Took 9.20 seconds to spawn the instance on the hypervisor.
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.340 186792 DEBUG nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.351 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.417 186792 INFO nova.compute.manager [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Took 10.00 seconds to build instance.
Nov 22 03:14:46 np0005531888 nova_compute[186788]: 2025-11-22 08:14:46.451 186792 DEBUG oslo_concurrency.lockutils [None req-52761510-73ff-478d-a323-149781bba5e0 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:47 np0005531888 nova_compute[186788]: 2025-11-22 08:14:47.386 186792 DEBUG nova.network.neutron [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Updated VIF entry in instance network info cache for port 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:14:47 np0005531888 nova_compute[186788]: 2025-11-22 08:14:47.386 186792 DEBUG nova.network.neutron [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Updating instance_info_cache with network_info: [{"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:14:47 np0005531888 nova_compute[186788]: 2025-11-22 08:14:47.400 186792 DEBUG oslo_concurrency.lockutils [req-be47c6ac-f3cc-4ff9-8388-ac9ac7e3b35f req-e3b399c5-ab6a-40cd-9b01-af564242b8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8ed928e2-c2fc-4d34-a5df-e191f0ee2880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:14:47 np0005531888 podman[237154]: 2025-11-22 08:14:47.701924703 +0000 UTC m=+0.066613369 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 03:14:47 np0005531888 podman[237153]: 2025-11-22 08:14:47.726475737 +0000 UTC m=+0.090407725 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:14:47 np0005531888 nova_compute[186788]: 2025-11-22 08:14:47.902 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.149 186792 DEBUG nova.compute.manager [req-acadc431-ef8a-48d7-bf78-b6517733c881 req-082a91f2-59f4-42ef-89ae-3dd44a2747df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.150 186792 DEBUG oslo_concurrency.lockutils [req-acadc431-ef8a-48d7-bf78-b6517733c881 req-082a91f2-59f4-42ef-89ae-3dd44a2747df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.150 186792 DEBUG oslo_concurrency.lockutils [req-acadc431-ef8a-48d7-bf78-b6517733c881 req-082a91f2-59f4-42ef-89ae-3dd44a2747df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.150 186792 DEBUG oslo_concurrency.lockutils [req-acadc431-ef8a-48d7-bf78-b6517733c881 req-082a91f2-59f4-42ef-89ae-3dd44a2747df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.150 186792 DEBUG nova.compute.manager [req-acadc431-ef8a-48d7-bf78-b6517733c881 req-082a91f2-59f4-42ef-89ae-3dd44a2747df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] No waiting events found dispatching network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.150 186792 WARNING nova.compute.manager [req-acadc431-ef8a-48d7-bf78-b6517733c881 req-082a91f2-59f4-42ef-89ae-3dd44a2747df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received unexpected event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd for instance with vm_state active and task_state None.
Nov 22 03:14:48 np0005531888 nova_compute[186788]: 2025-11-22 08:14:48.842 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:50 np0005531888 nova_compute[186788]: 2025-11-22 08:14:50.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:50 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:50.802 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:14:50 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:50.803 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.586 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.586 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.632 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.762 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.763 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.772 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.773 186792 INFO nova.compute.claims [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Claim successful on node compute-2.ctlplane.example.com
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.903 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:52 np0005531888 nova_compute[186788]: 2025-11-22 08:14:52.985 186792 DEBUG nova.compute.provider_tree [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.007 186792 DEBUG nova.scheduler.client.report [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.046 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.047 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.146 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.147 186792 DEBUG nova.network.neutron [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.168 186792 INFO nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.188 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.466 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.467 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.468 186792 INFO nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Creating image(s)
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.469 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.469 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.470 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.495 186792 DEBUG nova.policy [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.498 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.575 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.576 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.577 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.593 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.654 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.655 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:53 np0005531888 nova_compute[186788]: 2025-11-22 08:14:53.845 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.408 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk 1073741824" returned: 0 in 0.753s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.409 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.409 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.463 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.465 186792 DEBUG nova.virt.disk.api [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.465 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.520 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.521 186792 DEBUG nova.virt.disk.api [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.522 186792 DEBUG nova.objects.instance [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid 542fee2c-da44-4eb5-ba95-c2bd439402d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.536 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.537 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Ensure instance console log exists: /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.537 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.537 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:54 np0005531888 nova_compute[186788]: 2025-11-22 08:14:54.538 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:14:54.805 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:55 np0005531888 nova_compute[186788]: 2025-11-22 08:14:55.151 186792 DEBUG nova.network.neutron [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Successfully created port: d1446855-f298-4615-8b46-aaae3654486b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:14:57 np0005531888 nova_compute[186788]: 2025-11-22 08:14:57.181 186792 DEBUG nova.network.neutron [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Successfully updated port: d1446855-f298-4615-8b46-aaae3654486b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:14:57 np0005531888 nova_compute[186788]: 2025-11-22 08:14:57.198 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-542fee2c-da44-4eb5-ba95-c2bd439402d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:57 np0005531888 nova_compute[186788]: 2025-11-22 08:14:57.199 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-542fee2c-da44-4eb5-ba95-c2bd439402d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:57 np0005531888 nova_compute[186788]: 2025-11-22 08:14:57.199 186792 DEBUG nova.network.neutron [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:14:57 np0005531888 nova_compute[186788]: 2025-11-22 08:14:57.418 186792 DEBUG nova.network.neutron [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:14:57 np0005531888 nova_compute[186788]: 2025-11-22 08:14:57.906 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:58 np0005531888 nova_compute[186788]: 2025-11-22 08:14:58.495 186792 DEBUG nova.network.neutron [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Updating instance_info_cache with network_info: [{"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:58 np0005531888 nova_compute[186788]: 2025-11-22 08:14:58.846 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.371 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-542fee2c-da44-4eb5-ba95-c2bd439402d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.372 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Instance network_info: |[{"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.375 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Start _get_guest_xml network_info=[{"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.380 186792 WARNING nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.385 186792 DEBUG nova.virt.libvirt.host [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.386 186792 DEBUG nova.virt.libvirt.host [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.391 186792 DEBUG nova.virt.libvirt.host [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.392 186792 DEBUG nova.virt.libvirt.host [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.393 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.393 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.393 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.393 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.394 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.394 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.394 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.394 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.394 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.394 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.395 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.395 186792 DEBUG nova.virt.hardware [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.398 186792 DEBUG nova.virt.libvirt.vif [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1864001579',display_name='tempest-ServersTestJSON-server-1864001579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1864001579',id=144,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-5aybi1qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:53Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=542fee2c-da44-4eb5-ba95-c2bd439402d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.398 186792 DEBUG nova.network.os_vif_util [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.399 186792 DEBUG nova.network.os_vif_util [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.400 186792 DEBUG nova.objects.instance [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 542fee2c-da44-4eb5-ba95-c2bd439402d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.446 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <uuid>542fee2c-da44-4eb5-ba95-c2bd439402d8</uuid>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <name>instance-00000090</name>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersTestJSON-server-1864001579</nova:name>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:14:59</nova:creationTime>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        <nova:port uuid="d1446855-f298-4615-8b46-aaae3654486b">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <entry name="serial">542fee2c-da44-4eb5-ba95-c2bd439402d8</entry>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <entry name="uuid">542fee2c-da44-4eb5-ba95-c2bd439402d8</entry>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.config"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:ee:36:f2"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <target dev="tapd1446855-f2"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/console.log" append="off"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:14:59 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:14:59 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:14:59 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:14:59 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.448 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Preparing to wait for external event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.448 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.448 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.448 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.449 186792 DEBUG nova.virt.libvirt.vif [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1864001579',display_name='tempest-ServersTestJSON-server-1864001579',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1864001579',id=144,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-5aybi1qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:53Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=542fee2c-da44-4eb5-ba95-c2bd439402d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.449 186792 DEBUG nova.network.os_vif_util [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.450 186792 DEBUG nova.network.os_vif_util [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.450 186792 DEBUG os_vif [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.451 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.451 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.454 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1446855-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.454 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1446855-f2, col_values=(('external_ids', {'iface-id': 'd1446855-f298-4615-8b46-aaae3654486b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:36:f2', 'vm-uuid': '542fee2c-da44-4eb5-ba95-c2bd439402d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:59 np0005531888 NetworkManager[55166]: <info>  [1763799299.4570] manager: (tapd1446855-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.458 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.464 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.465 186792 INFO os_vif [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2')#033[00m
Nov 22 03:14:59 np0005531888 podman[237214]: 2025-11-22 08:14:59.559281833 +0000 UTC m=+0.057295640 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.567 186792 DEBUG nova.compute.manager [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-changed-d1446855-f298-4615-8b46-aaae3654486b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.567 186792 DEBUG nova.compute.manager [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Refreshing instance network info cache due to event network-changed-d1446855-f298-4615-8b46-aaae3654486b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.567 186792 DEBUG oslo_concurrency.lockutils [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-542fee2c-da44-4eb5-ba95-c2bd439402d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.567 186792 DEBUG oslo_concurrency.lockutils [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-542fee2c-da44-4eb5-ba95-c2bd439402d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.567 186792 DEBUG nova.network.neutron [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Refreshing network info cache for port d1446855-f298-4615-8b46-aaae3654486b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.688 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.689 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.689 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:ee:36:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:14:59 np0005531888 nova_compute[186788]: 2025-11-22 08:14:59.690 186792 INFO nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Using config drive#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.048 186792 INFO nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Creating config drive at /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.config#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.054 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvucivs5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.179 186792 DEBUG oslo_concurrency.processutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvucivs5" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:00 np0005531888 kernel: tapd1446855-f2: entered promiscuous mode
Nov 22 03:15:00 np0005531888 NetworkManager[55166]: <info>  [1763799300.2320] manager: (tapd1446855-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Nov 22 03:15:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:00Z|00552|binding|INFO|Claiming lport d1446855-f298-4615-8b46-aaae3654486b for this chassis.
Nov 22 03:15:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:00Z|00553|binding|INFO|d1446855-f298-4615-8b46-aaae3654486b: Claiming fa:16:3e:ee:36:f2 10.100.0.11
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.236 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:00Z|00554|binding|INFO|Setting lport d1446855-f298-4615-8b46-aaae3654486b ovn-installed in OVS
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.248 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.251 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531888 systemd-udevd[237267]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:15:00 np0005531888 systemd-machined[153106]: New machine qemu-68-instance-00000090.
Nov 22 03:15:00 np0005531888 NetworkManager[55166]: <info>  [1763799300.2853] device (tapd1446855-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:15:00 np0005531888 NetworkManager[55166]: <info>  [1763799300.2859] device (tapd1446855-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:15:00 np0005531888 systemd[1]: Started Virtual Machine qemu-68-instance-00000090.
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.326 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:36:f2 10.100.0.11'], port_security=['fa:16:3e:ee:36:f2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '542fee2c-da44-4eb5-ba95-c2bd439402d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d1446855-f298-4615-8b46-aaae3654486b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:00Z|00555|binding|INFO|Setting lport d1446855-f298-4615-8b46-aaae3654486b up in Southbound
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.327 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d1446855-f298-4615-8b46-aaae3654486b in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.329 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.345 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1925da3a-fdc5-4011-8451-63a93f0a6cd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.380 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[dd06e8ef-3cb5-443d-a441-d67df25f8e46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.384 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b248e6b2-f932-4e8b-be65-c7a3da2714fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.414 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4c04fc88-da80-489c-a117-3494e36c9350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.430 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fe68f699-050e-4d28-99c8-37b5d091a996]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 17106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237282, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.451 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0597d371-8c63-42c5-8dc5-1c98508c4f80]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596531, 'tstamp': 596531}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237283, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596534, 'tstamp': 596534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237283, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.453 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.455 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.455 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.456 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.456 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:00.456 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.674 186792 DEBUG nova.compute.manager [req-0638b02c-fe55-4f8f-b434-a5d7a51a014e req-2b4ebb27-5126-4b92-80d1-53834a4da6c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.676 186792 DEBUG oslo_concurrency.lockutils [req-0638b02c-fe55-4f8f-b434-a5d7a51a014e req-2b4ebb27-5126-4b92-80d1-53834a4da6c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.677 186792 DEBUG oslo_concurrency.lockutils [req-0638b02c-fe55-4f8f-b434-a5d7a51a014e req-2b4ebb27-5126-4b92-80d1-53834a4da6c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.677 186792 DEBUG oslo_concurrency.lockutils [req-0638b02c-fe55-4f8f-b434-a5d7a51a014e req-2b4ebb27-5126-4b92-80d1-53834a4da6c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.678 186792 DEBUG nova.compute.manager [req-0638b02c-fe55-4f8f-b434-a5d7a51a014e req-2b4ebb27-5126-4b92-80d1-53834a4da6c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Processing event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.898 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799300.8978033, 542fee2c-da44-4eb5-ba95-c2bd439402d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.899 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] VM Started (Lifecycle Event)#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.903 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.909 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.914 186792 INFO nova.virt.libvirt.driver [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Instance spawned successfully.#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.915 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.918 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.924 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.941 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.942 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.943 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.944 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.944 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.945 186792 DEBUG nova.virt.libvirt.driver [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.950 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.950 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799300.9031613, 542fee2c-da44-4eb5-ba95-c2bd439402d8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.951 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.977 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.982 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799300.908177, 542fee2c-da44-4eb5-ba95-c2bd439402d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:00 np0005531888 nova_compute[186788]: 2025-11-22 08:15:00.982 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.000 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.004 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.039 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.089 186792 INFO nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Took 7.62 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.090 186792 DEBUG nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.225 186792 INFO nova.compute.manager [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Took 8.50 seconds to build instance.#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.264 186792 DEBUG oslo_concurrency.lockutils [None req-bbe312d6-0f83-4d4a-bd20-df2c8ccae91d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:01 np0005531888 podman[237293]: 2025-11-22 08:15:01.688416256 +0000 UTC m=+0.059376842 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:15:01 np0005531888 nova_compute[186788]: 2025-11-22 08:15:01.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:15:02 np0005531888 nova_compute[186788]: 2025-11-22 08:15:02.815 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:02 np0005531888 nova_compute[186788]: 2025-11-22 08:15:02.816 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:02 np0005531888 nova_compute[186788]: 2025-11-22 08:15:02.816 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:15:02 np0005531888 nova_compute[186788]: 2025-11-22 08:15:02.817 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:03Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:de:4a:80 10.100.0.7
Nov 22 03:15:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:03Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:4a:80 10.100.0.7
Nov 22 03:15:03 np0005531888 nova_compute[186788]: 2025-11-22 08:15:03.848 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:04 np0005531888 nova_compute[186788]: 2025-11-22 08:15:04.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:04 np0005531888 podman[237315]: 2025-11-22 08:15:04.683229058 +0000 UTC m=+0.056543733 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 22 03:15:05 np0005531888 nova_compute[186788]: 2025-11-22 08:15:05.653 186792 DEBUG nova.compute.manager [req-a584743c-1820-4974-9d06-bc73f6460b05 req-53e3ddf2-5532-45e0-916e-d187efb74b0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:05 np0005531888 nova_compute[186788]: 2025-11-22 08:15:05.654 186792 DEBUG oslo_concurrency.lockutils [req-a584743c-1820-4974-9d06-bc73f6460b05 req-53e3ddf2-5532-45e0-916e-d187efb74b0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:05 np0005531888 nova_compute[186788]: 2025-11-22 08:15:05.655 186792 DEBUG oslo_concurrency.lockutils [req-a584743c-1820-4974-9d06-bc73f6460b05 req-53e3ddf2-5532-45e0-916e-d187efb74b0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:05 np0005531888 nova_compute[186788]: 2025-11-22 08:15:05.655 186792 DEBUG oslo_concurrency.lockutils [req-a584743c-1820-4974-9d06-bc73f6460b05 req-53e3ddf2-5532-45e0-916e-d187efb74b0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:05 np0005531888 nova_compute[186788]: 2025-11-22 08:15:05.656 186792 DEBUG nova.compute.manager [req-a584743c-1820-4974-9d06-bc73f6460b05 req-53e3ddf2-5532-45e0-916e-d187efb74b0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] No waiting events found dispatching network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:05 np0005531888 nova_compute[186788]: 2025-11-22 08:15:05.656 186792 WARNING nova.compute.manager [req-a584743c-1820-4974-9d06-bc73f6460b05 req-53e3ddf2-5532-45e0-916e-d187efb74b0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received unexpected event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b for instance with vm_state active and task_state None.#033[00m
Nov 22 03:15:06 np0005531888 nova_compute[186788]: 2025-11-22 08:15:06.401 186792 DEBUG nova.network.neutron [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Updated VIF entry in instance network info cache for port d1446855-f298-4615-8b46-aaae3654486b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:15:06 np0005531888 nova_compute[186788]: 2025-11-22 08:15:06.402 186792 DEBUG nova.network.neutron [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Updating instance_info_cache with network_info: [{"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:06 np0005531888 nova_compute[186788]: 2025-11-22 08:15:06.424 186792 DEBUG oslo_concurrency.lockutils [req-74fe8b2a-29d7-4fc6-a2b7-916fb9400cd2 req-36376b4b-5fa6-4fb5-ab52-efc8f4defce6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-542fee2c-da44-4eb5-ba95-c2bd439402d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:07 np0005531888 podman[237337]: 2025-11-22 08:15:07.696323269 +0000 UTC m=+0.072923075 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 03:15:07 np0005531888 podman[237338]: 2025-11-22 08:15:07.728550562 +0000 UTC m=+0.095574772 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.594 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updating instance_info_cache with network_info: [{"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.631 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.632 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.633 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.633 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.850 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.936 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.937 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.937 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.938 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.938 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.957 186792 INFO nova.compute.manager [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Terminating instance#033[00m
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.964 186792 DEBUG nova.compute.manager [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:15:08 np0005531888 kernel: tapd1446855-f2 (unregistering): left promiscuous mode
Nov 22 03:15:08 np0005531888 NetworkManager[55166]: <info>  [1763799308.9844] device (tapd1446855-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:15:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:08Z|00556|binding|INFO|Releasing lport d1446855-f298-4615-8b46-aaae3654486b from this chassis (sb_readonly=0)
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.991 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:08Z|00557|binding|INFO|Setting lport d1446855-f298-4615-8b46-aaae3654486b down in Southbound
Nov 22 03:15:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:08Z|00558|binding|INFO|Removing iface tapd1446855-f2 ovn-installed in OVS
Nov 22 03:15:08 np0005531888 nova_compute[186788]: 2025-11-22 08:15:08.994 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.004 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:36:f2 10.100.0.11'], port_security=['fa:16:3e:ee:36:f2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '542fee2c-da44-4eb5-ba95-c2bd439402d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=d1446855-f298-4615-8b46-aaae3654486b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.005 104023 INFO neutron.agent.ovn.metadata.agent [-] Port d1446855-f298-4615-8b46-aaae3654486b in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.008 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.023 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[700f6426-994d-4e71-9b4a-99a8e366c8dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:09 np0005531888 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000090.scope: Deactivated successfully.
Nov 22 03:15:09 np0005531888 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000090.scope: Consumed 8.737s CPU time.
Nov 22 03:15:09 np0005531888 systemd-machined[153106]: Machine qemu-68-instance-00000090 terminated.
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.053 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[315ad93e-b643-4b26-8c8a-14dae3266201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.057 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a075bdcf-78af-4455-a262-688ff3f1f288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.084 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[70674122-1e09-43c4-876e-e2898dca97bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.100 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e5234aa6-7c43-4506-a910-bb5ee599c003]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 17106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237394, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.116 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[44f82b04-6314-416d-b7c8-01735efd99c4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596531, 'tstamp': 596531}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237395, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596534, 'tstamp': 596534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237395, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.118 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.120 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.124 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.124 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.125 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.125 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:09.126 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:09 np0005531888 kernel: tapd1446855-f2: entered promiscuous mode
Nov 22 03:15:09 np0005531888 kernel: tapd1446855-f2 (unregistering): left promiscuous mode
Nov 22 03:15:09 np0005531888 NetworkManager[55166]: <info>  [1763799309.1845] manager: (tapd1446855-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.190 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.227 186792 INFO nova.virt.libvirt.driver [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Instance destroyed successfully.#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.228 186792 DEBUG nova.objects.instance [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid 542fee2c-da44-4eb5-ba95-c2bd439402d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.460 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.575 186792 DEBUG nova.virt.libvirt.vif [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:14:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1864001579',display_name='tempest-ServersTestJSON-server-1864001579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1864001579',id=144,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:15:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-5aybi1qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',imag
e_min_ram='0',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:15:01Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=542fee2c-da44-4eb5-ba95-c2bd439402d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.575 186792 DEBUG nova.network.os_vif_util [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "d1446855-f298-4615-8b46-aaae3654486b", "address": "fa:16:3e:ee:36:f2", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1446855-f2", "ovs_interfaceid": "d1446855-f298-4615-8b46-aaae3654486b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.576 186792 DEBUG nova.network.os_vif_util [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.577 186792 DEBUG os_vif [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.578 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.578 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1446855-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.580 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.581 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.583 186792 INFO os_vif [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:f2,bridge_name='br-int',has_traffic_filtering=True,id=d1446855-f298-4615-8b46-aaae3654486b,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1446855-f2')#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.584 186792 INFO nova.virt.libvirt.driver [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Deleting instance files /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8_del#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.585 186792 INFO nova.virt.libvirt.driver [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Deletion of /var/lib/nova/instances/542fee2c-da44-4eb5-ba95-c2bd439402d8_del complete#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.696 186792 INFO nova.compute.manager [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.696 186792 DEBUG oslo.service.loopingcall [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.697 186792 DEBUG nova.compute.manager [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:15:09 np0005531888 nova_compute[186788]: 2025-11-22 08:15:09.697 186792 DEBUG nova.network.neutron [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:15:10 np0005531888 nova_compute[186788]: 2025-11-22 08:15:10.939 186792 DEBUG nova.compute.manager [req-0b88991b-d6c5-4188-9be5-5898a3b0e316 req-9f855f4d-4720-472d-9dfd-c007c28bbf21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-vif-unplugged-d1446855-f298-4615-8b46-aaae3654486b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:10 np0005531888 nova_compute[186788]: 2025-11-22 08:15:10.939 186792 DEBUG oslo_concurrency.lockutils [req-0b88991b-d6c5-4188-9be5-5898a3b0e316 req-9f855f4d-4720-472d-9dfd-c007c28bbf21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:10 np0005531888 nova_compute[186788]: 2025-11-22 08:15:10.939 186792 DEBUG oslo_concurrency.lockutils [req-0b88991b-d6c5-4188-9be5-5898a3b0e316 req-9f855f4d-4720-472d-9dfd-c007c28bbf21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:10 np0005531888 nova_compute[186788]: 2025-11-22 08:15:10.940 186792 DEBUG oslo_concurrency.lockutils [req-0b88991b-d6c5-4188-9be5-5898a3b0e316 req-9f855f4d-4720-472d-9dfd-c007c28bbf21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:10 np0005531888 nova_compute[186788]: 2025-11-22 08:15:10.940 186792 DEBUG nova.compute.manager [req-0b88991b-d6c5-4188-9be5-5898a3b0e316 req-9f855f4d-4720-472d-9dfd-c007c28bbf21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] No waiting events found dispatching network-vif-unplugged-d1446855-f298-4615-8b46-aaae3654486b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:10 np0005531888 nova_compute[186788]: 2025-11-22 08:15:10.940 186792 DEBUG nova.compute.manager [req-0b88991b-d6c5-4188-9be5-5898a3b0e316 req-9f855f4d-4720-472d-9dfd-c007c28bbf21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-vif-unplugged-d1446855-f298-4615-8b46-aaae3654486b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.409 186792 DEBUG nova.network.neutron [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.448 186792 INFO nova.compute.manager [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Took 1.75 seconds to deallocate network for instance.#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.543 186792 DEBUG nova.compute.manager [req-885d5ce9-8882-4c15-bfdc-4a33d8dee717 req-ac2a3c3f-29bb-4624-a544-70d7b4c17f16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-vif-deleted-d1446855-f298-4615-8b46-aaae3654486b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.592 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.593 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.634 186792 DEBUG nova.scheduler.client.report [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.721 186792 DEBUG nova.scheduler.client.report [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.721 186792 DEBUG nova.compute.provider_tree [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.739 186792 DEBUG nova.scheduler.client.report [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.785 186792 DEBUG nova.scheduler.client.report [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.879 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.907 186792 WARNING nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] While synchronizing instance power states, found 3 instances in the database and 2 instances on the hypervisor.#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.907 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.908 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.908 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Triggering sync for uuid 542fee2c-da44-4eb5-ba95-c2bd439402d8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.909 186792 DEBUG nova.compute.provider_tree [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.910 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.911 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.911 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.911 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.912 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.931 186792 DEBUG nova.scheduler.client.report [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.941 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.956 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.961 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:11 np0005531888 nova_compute[186788]: 2025-11-22 08:15:11.976 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.003 186792 INFO nova.scheduler.client.report [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance 542fee2c-da44-4eb5-ba95-c2bd439402d8#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.110 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.137 186792 DEBUG oslo_concurrency.lockutils [None req-34194fd9-cacc-4ed6-adbd-a80c51c58139 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.139 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.166 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.173 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.174 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.232 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.240 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.301 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.302 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.356 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.507 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.508 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5352MB free_disk=73.2158432006836GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.509 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.509 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.667 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.668 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.668 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.668 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.791 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.841 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.876 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:15:12 np0005531888 nova_compute[186788]: 2025-11-22 08:15:12.876 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.410 186792 DEBUG nova.compute.manager [req-fa1d3b30-8d0d-4ff5-8ce4-8cd6787fdf5b req-dfd6ab8e-c2d4-4923-92a2-2e3c500fa9cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.411 186792 DEBUG oslo_concurrency.lockutils [req-fa1d3b30-8d0d-4ff5-8ce4-8cd6787fdf5b req-dfd6ab8e-c2d4-4923-92a2-2e3c500fa9cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.412 186792 DEBUG oslo_concurrency.lockutils [req-fa1d3b30-8d0d-4ff5-8ce4-8cd6787fdf5b req-dfd6ab8e-c2d4-4923-92a2-2e3c500fa9cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.412 186792 DEBUG oslo_concurrency.lockutils [req-fa1d3b30-8d0d-4ff5-8ce4-8cd6787fdf5b req-dfd6ab8e-c2d4-4923-92a2-2e3c500fa9cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "542fee2c-da44-4eb5-ba95-c2bd439402d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.412 186792 DEBUG nova.compute.manager [req-fa1d3b30-8d0d-4ff5-8ce4-8cd6787fdf5b req-dfd6ab8e-c2d4-4923-92a2-2e3c500fa9cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] No waiting events found dispatching network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.413 186792 WARNING nova.compute.manager [req-fa1d3b30-8d0d-4ff5-8ce4-8cd6787fdf5b req-dfd6ab8e-c2d4-4923-92a2-2e3c500fa9cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Received unexpected event network-vif-plugged-d1446855-f298-4615-8b46-aaae3654486b for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.852 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.861 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.861 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.862 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.862 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.863 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.874 186792 INFO nova.compute.manager [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Terminating instance#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.881 186792 DEBUG nova.compute.manager [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:15:13 np0005531888 kernel: tap0a0e461d-9e (unregistering): left promiscuous mode
Nov 22 03:15:13 np0005531888 NetworkManager[55166]: <info>  [1763799313.9202] device (tap0a0e461d-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.927 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:13Z|00559|binding|INFO|Releasing lport 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd from this chassis (sb_readonly=0)
Nov 22 03:15:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:13Z|00560|binding|INFO|Setting lport 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd down in Southbound
Nov 22 03:15:13 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:13Z|00561|binding|INFO|Removing iface tap0a0e461d-9e ovn-installed in OVS
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.929 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:13.939 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:4a:80 10.100.0.7'], port_security=['fa:16:3e:de:4a:80 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8ed928e2-c2fc-4d34-a5df-e191f0ee2880', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:13 np0005531888 nova_compute[186788]: 2025-11-22 08:15:13.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:13.940 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis#033[00m
Nov 22 03:15:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:13.944 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:15:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:13.960 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[264beb00-c9e4-4ffa-8b84-6f54faeb55f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:13 np0005531888 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Nov 22 03:15:13 np0005531888 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008e.scope: Consumed 15.998s CPU time.
Nov 22 03:15:13 np0005531888 systemd-machined[153106]: Machine qemu-67-instance-0000008e terminated.
Nov 22 03:15:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:13.988 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2006a4ec-210e-45ea-94df-08d5bebd3e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:13 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:13.991 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[69205838-40c8-4d7a-8e63-9ddd2a7dbc2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.016 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a20787c0-69ad-48f3-b01c-ea395720425f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.031 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e6433f-8d69-4156-bbf0-32acb54a9c8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 17106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237437, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.047 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c9043222-2010-420e-b856-131ae66be5f1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596531, 'tstamp': 596531}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237438, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596534, 'tstamp': 596534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237438, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.048 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.049 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.053 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.053 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.053 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.054 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:14.054 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.138 186792 INFO nova.virt.libvirt.driver [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Instance destroyed successfully.#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.139 186792 DEBUG nova.objects.instance [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.154 186792 DEBUG nova.virt.libvirt.vif [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:14:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1864001579',display_name='tempest-ServersTestJSON-server-1864001579',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1864001579',id=142,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:14:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-ocio1nf3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',imag
e_min_ram='0',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:14:46Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=8ed928e2-c2fc-4d34-a5df-e191f0ee2880,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.154 186792 DEBUG nova.network.os_vif_util [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "address": "fa:16:3e:de:4a:80", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a0e461d-9e", "ovs_interfaceid": "0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.155 186792 DEBUG nova.network.os_vif_util [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.155 186792 DEBUG os_vif [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.157 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a0e461d-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.159 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.160 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.162 186792 INFO os_vif [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4a:80,bridge_name='br-int',has_traffic_filtering=True,id=0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a0e461d-9e')#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.162 186792 INFO nova.virt.libvirt.driver [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Deleting instance files /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880_del#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.163 186792 INFO nova.virt.libvirt.driver [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Deletion of /var/lib/nova/instances/8ed928e2-c2fc-4d34-a5df-e191f0ee2880_del complete#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.262 186792 INFO nova.compute.manager [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.263 186792 DEBUG oslo.service.loopingcall [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.264 186792 DEBUG nova.compute.manager [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:15:14 np0005531888 nova_compute[186788]: 2025-11-22 08:15:14.264 186792 DEBUG nova.network.neutron [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.336 186792 DEBUG nova.network.neutron [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.388 186792 INFO nova.compute.manager [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Took 1.12 seconds to deallocate network for instance.#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.499 186792 DEBUG nova.compute.manager [req-621c0343-9428-4379-8df8-ee9b66fc0359 req-74f98982-0646-4a96-bc20-98e1620e13d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received event network-vif-deleted-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.504 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.504 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.563 186792 DEBUG nova.compute.manager [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received event network-vif-unplugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.564 186792 DEBUG oslo_concurrency.lockutils [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.565 186792 DEBUG oslo_concurrency.lockutils [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.565 186792 DEBUG oslo_concurrency.lockutils [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.566 186792 DEBUG nova.compute.manager [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] No waiting events found dispatching network-vif-unplugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.566 186792 WARNING nova.compute.manager [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received unexpected event network-vif-unplugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.567 186792 DEBUG nova.compute.manager [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.567 186792 DEBUG oslo_concurrency.lockutils [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.568 186792 DEBUG oslo_concurrency.lockutils [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.568 186792 DEBUG oslo_concurrency.lockutils [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.569 186792 DEBUG nova.compute.manager [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] No waiting events found dispatching network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.569 186792 WARNING nova.compute.manager [req-a138594a-f43f-450e-a7a2-ddf714f4f64b req-4c0efc73-09be-4410-8f7c-e01b6e02bac2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Received unexpected event network-vif-plugged-0a0e461d-9e37-4a2e-99b4-3c4dec2f48bd for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.629 186792 DEBUG nova.compute.provider_tree [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.646 186792 DEBUG nova.scheduler.client.report [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.696 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.833 186792 INFO nova.scheduler.client.report [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance 8ed928e2-c2fc-4d34-a5df-e191f0ee2880#033[00m
Nov 22 03:15:15 np0005531888 nova_compute[186788]: 2025-11-22 08:15:15.993 186792 DEBUG oslo_concurrency.lockutils [None req-fb7f0193-06c0-46b1-878e-bbe53cb5a96d 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "8ed928e2-c2fc-4d34-a5df-e191f0ee2880" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:16 np0005531888 nova_compute[186788]: 2025-11-22 08:15:16.877 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:16 np0005531888 nova_compute[186788]: 2025-11-22 08:15:16.878 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:15:18 np0005531888 podman[237457]: 2025-11-22 08:15:18.684796308 +0000 UTC m=+0.054591723 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:15:18 np0005531888 podman[237458]: 2025-11-22 08:15:18.689112575 +0000 UTC m=+0.055806904 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 03:15:18 np0005531888 nova_compute[186788]: 2025-11-22 08:15:18.855 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531888 nova_compute[186788]: 2025-11-22 08:15:19.159 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.213 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.214 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.248 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.378 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.379 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.389 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.390 186792 INFO nova.compute.claims [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.772 186792 DEBUG nova.compute.provider_tree [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.786 186792 DEBUG nova.scheduler.client.report [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.967 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:20 np0005531888 nova_compute[186788]: 2025-11-22 08:15:20.967 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.040 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.040 186792 DEBUG nova.network.neutron [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.099 186792 INFO nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.142 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.316 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.317 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.317 186792 INFO nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Creating image(s)#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.318 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.318 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.319 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.332 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.405 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.406 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.407 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.426 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.501 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.502 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.561 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.562 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.563 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.625 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.627 186792 DEBUG nova.virt.disk.api [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.628 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.700 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.701 186792 DEBUG nova.virt.disk.api [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.702 186792 DEBUG nova.objects.instance [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid 2770b8d9-8240-41e5-9b5d-5424298ff3f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.719 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.720 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Ensure instance console log exists: /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.720 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.720 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.721 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:21 np0005531888 nova_compute[186788]: 2025-11-22 08:15:21.930 186792 DEBUG nova.policy [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:15:23 np0005531888 nova_compute[186788]: 2025-11-22 08:15:23.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:23 np0005531888 nova_compute[186788]: 2025-11-22 08:15:23.893 186792 DEBUG nova.network.neutron [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Successfully created port: dacaeb80-9494-4184-a865-eeefe225e904 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.161 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.226 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799309.2251198, 542fee2c-da44-4eb5-ba95-c2bd439402d8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.227 186792 INFO nova.compute.manager [-] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.265 186792 DEBUG nova.compute.manager [None req-161d7d3f-05e8-44a1-8da5-bd2fd588ba8f - - - - - -] [instance: 542fee2c-da44-4eb5-ba95-c2bd439402d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.864 186792 DEBUG nova.network.neutron [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Successfully updated port: dacaeb80-9494-4184-a865-eeefe225e904 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.911 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-2770b8d9-8240-41e5-9b5d-5424298ff3f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.912 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-2770b8d9-8240-41e5-9b5d-5424298ff3f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:24 np0005531888 nova_compute[186788]: 2025-11-22 08:15:24.912 186792 DEBUG nova.network.neutron [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:15:25 np0005531888 nova_compute[186788]: 2025-11-22 08:15:25.225 186792 DEBUG nova.compute.manager [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-changed-dacaeb80-9494-4184-a865-eeefe225e904 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:25 np0005531888 nova_compute[186788]: 2025-11-22 08:15:25.225 186792 DEBUG nova.compute.manager [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Refreshing instance network info cache due to event network-changed-dacaeb80-9494-4184-a865-eeefe225e904. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:15:25 np0005531888 nova_compute[186788]: 2025-11-22 08:15:25.225 186792 DEBUG oslo_concurrency.lockutils [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-2770b8d9-8240-41e5-9b5d-5424298ff3f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:25 np0005531888 nova_compute[186788]: 2025-11-22 08:15:25.339 186792 DEBUG nova.network.neutron [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:15:28 np0005531888 nova_compute[186788]: 2025-11-22 08:15:28.858 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.138 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799314.136451, 8ed928e2-c2fc-4d34-a5df-e191f0ee2880 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.139 186792 INFO nova.compute.manager [-] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.163 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.173 186792 DEBUG nova.compute.manager [None req-48b01c0e-f7d6-4a54-aca7-b5298b85affa - - - - - -] [instance: 8ed928e2-c2fc-4d34-a5df-e191f0ee2880] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.215 186792 DEBUG nova.network.neutron [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Updating instance_info_cache with network_info: [{"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.324 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-2770b8d9-8240-41e5-9b5d-5424298ff3f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.324 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Instance network_info: |[{"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.325 186792 DEBUG oslo_concurrency.lockutils [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-2770b8d9-8240-41e5-9b5d-5424298ff3f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.325 186792 DEBUG nova.network.neutron [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Refreshing network info cache for port dacaeb80-9494-4184-a865-eeefe225e904 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.328 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Start _get_guest_xml network_info=[{"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.332 186792 WARNING nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.338 186792 DEBUG nova.virt.libvirt.host [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.338 186792 DEBUG nova.virt.libvirt.host [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.349 186792 DEBUG nova.virt.libvirt.host [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.350 186792 DEBUG nova.virt.libvirt.host [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.351 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.351 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.351 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.352 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.352 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.352 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.352 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.352 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.353 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.353 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.353 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.353 186792 DEBUG nova.virt.hardware [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.356 186792 DEBUG nova.virt.libvirt.vif [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1143915207',display_name='tempest-ServersTestJSON-server-1143915207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1143915207',id=145,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-qd8ypeff',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:15:21Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=2770b8d9-8240-41e5-9b5d-5424298ff3f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.357 186792 DEBUG nova.network.os_vif_util [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.357 186792 DEBUG nova.network.os_vif_util [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.358 186792 DEBUG nova.objects.instance [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 2770b8d9-8240-41e5-9b5d-5424298ff3f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.421 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <uuid>2770b8d9-8240-41e5-9b5d-5424298ff3f8</uuid>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <name>instance-00000091</name>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServersTestJSON-server-1143915207</nova:name>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:15:29</nova:creationTime>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        <nova:port uuid="dacaeb80-9494-4184-a865-eeefe225e904">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <entry name="serial">2770b8d9-8240-41e5-9b5d-5424298ff3f8</entry>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <entry name="uuid">2770b8d9-8240-41e5-9b5d-5424298ff3f8</entry>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.config"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:60:6f:35"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <target dev="tapdacaeb80-94"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/console.log" append="off"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:15:29 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:15:29 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:15:29 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:15:29 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.423 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Preparing to wait for external event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.423 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.424 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.424 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.426 186792 DEBUG nova.virt.libvirt.vif [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1143915207',display_name='tempest-ServersTestJSON-server-1143915207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1143915207',id=145,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-qd8ypeff',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:15:21Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=2770b8d9-8240-41e5-9b5d-5424298ff3f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.427 186792 DEBUG nova.network.os_vif_util [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.428 186792 DEBUG nova.network.os_vif_util [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.429 186792 DEBUG os_vif [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.429 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.430 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.431 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.435 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdacaeb80-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.436 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdacaeb80-94, col_values=(('external_ids', {'iface-id': 'dacaeb80-9494-4184-a865-eeefe225e904', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:6f:35', 'vm-uuid': '2770b8d9-8240-41e5-9b5d-5424298ff3f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.439 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:29 np0005531888 NetworkManager[55166]: <info>  [1763799329.4415] manager: (tapdacaeb80-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.448 186792 INFO os_vif [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94')#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.645 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.646 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.647 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:60:6f:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:15:29 np0005531888 nova_compute[186788]: 2025-11-22 08:15:29.648 186792 INFO nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Using config drive#033[00m
Nov 22 03:15:29 np0005531888 podman[237518]: 2025-11-22 08:15:29.698418619 +0000 UTC m=+0.074079493 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.038 186792 INFO nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Creating config drive at /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.config#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.043 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp960rxkwi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.172 186792 DEBUG oslo_concurrency.processutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp960rxkwi" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:30 np0005531888 kernel: tapdacaeb80-94: entered promiscuous mode
Nov 22 03:15:30 np0005531888 NetworkManager[55166]: <info>  [1763799330.2426] manager: (tapdacaeb80-94): new Tun device (/org/freedesktop/NetworkManager/Devices/266)
Nov 22 03:15:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:30Z|00562|binding|INFO|Claiming lport dacaeb80-9494-4184-a865-eeefe225e904 for this chassis.
Nov 22 03:15:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:30Z|00563|binding|INFO|dacaeb80-9494-4184-a865-eeefe225e904: Claiming fa:16:3e:60:6f:35 10.100.0.10
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.244 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:30Z|00564|binding|INFO|Setting lport dacaeb80-9494-4184-a865-eeefe225e904 ovn-installed in OVS
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.257 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.260 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:30 np0005531888 systemd-udevd[237553]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:15:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:30Z|00565|binding|INFO|Setting lport dacaeb80-9494-4184-a865-eeefe225e904 up in Southbound
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.278 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:6f:35 10.100.0.10'], port_security=['fa:16:3e:60:6f:35 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2770b8d9-8240-41e5-9b5d-5424298ff3f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=dacaeb80-9494-4184-a865-eeefe225e904) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.280 104023 INFO neutron.agent.ovn.metadata.agent [-] Port dacaeb80-9494-4184-a865-eeefe225e904 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:15:30 np0005531888 NetworkManager[55166]: <info>  [1763799330.2816] device (tapdacaeb80-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:15:30 np0005531888 NetworkManager[55166]: <info>  [1763799330.2826] device (tapdacaeb80-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.281 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:15:30 np0005531888 systemd-machined[153106]: New machine qemu-69-instance-00000091.
Nov 22 03:15:30 np0005531888 systemd[1]: Started Virtual Machine qemu-69-instance-00000091.
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.299 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf65ae4-2e31-42e8-96a1-1ad4db3548a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.332 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[85476da9-cd22-4fb0-ad4d-f0d7b5e196b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.335 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ba09ec7e-fe8c-4e99-9de2-d4d6d626713d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.366 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f48d6671-6e22-4bb0-9127-46e44598d2f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.388 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ffaaf2e0-07ff-4565-9f3b-07da0a8e8f5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 17106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237569, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.408 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[05f7fda1-bffc-4ee7-8512-9d0a7b6128a5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596531, 'tstamp': 596531}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237570, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596534, 'tstamp': 596534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237570, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.410 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.412 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.413 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.414 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.414 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:30.414 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.857 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799330.8571696, 2770b8d9-8240-41e5-9b5d-5424298ff3f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.858 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] VM Started (Lifecycle Event)#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.883 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.888 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799330.8580506, 2770b8d9-8240-41e5-9b5d-5424298ff3f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.888 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.905 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.909 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:15:30 np0005531888 nova_compute[186788]: 2025-11-22 08:15:30.927 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.750 186792 DEBUG nova.compute.manager [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.750 186792 DEBUG oslo_concurrency.lockutils [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.751 186792 DEBUG oslo_concurrency.lockutils [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.751 186792 DEBUG oslo_concurrency.lockutils [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.751 186792 DEBUG nova.compute.manager [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Processing event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.752 186792 DEBUG nova.compute.manager [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.752 186792 DEBUG oslo_concurrency.lockutils [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.752 186792 DEBUG oslo_concurrency.lockutils [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.752 186792 DEBUG oslo_concurrency.lockutils [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.753 186792 DEBUG nova.compute.manager [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] No waiting events found dispatching network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.753 186792 WARNING nova.compute.manager [req-3d0ec05f-6b3f-43b5-8437-11ef94cb4f3a req-931efc83-34fa-40e2-9def-04ff648b537d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received unexpected event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.753 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.757 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799331.7572002, 2770b8d9-8240-41e5-9b5d-5424298ff3f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.757 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.759 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.763 186792 INFO nova.virt.libvirt.driver [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Instance spawned successfully.#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.763 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.782 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.786 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.786 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.787 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.788 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.788 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.788 186792 DEBUG nova.virt.libvirt.driver [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.793 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.841 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.885 186792 INFO nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Took 10.57 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.886 186792 DEBUG nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:31 np0005531888 nova_compute[186788]: 2025-11-22 08:15:31.992 186792 INFO nova.compute.manager [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Took 11.65 seconds to build instance.#033[00m
Nov 22 03:15:32 np0005531888 nova_compute[186788]: 2025-11-22 08:15:32.017 186792 DEBUG oslo_concurrency.lockutils [None req-f7668ff8-83d7-4267-8acb-525bdcbe78ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:32 np0005531888 podman[237578]: 2025-11-22 08:15:32.685683986 +0000 UTC m=+0.054406269 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:15:32 np0005531888 nova_compute[186788]: 2025-11-22 08:15:32.891 186792 DEBUG nova.network.neutron [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Updated VIF entry in instance network info cache for port dacaeb80-9494-4184-a865-eeefe225e904. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:15:32 np0005531888 nova_compute[186788]: 2025-11-22 08:15:32.891 186792 DEBUG nova.network.neutron [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Updating instance_info_cache with network_info: [{"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:32 np0005531888 nova_compute[186788]: 2025-11-22 08:15:32.903 186792 DEBUG oslo_concurrency.lockutils [req-7c983491-4b2b-418a-8a59-58cd8829ffdc req-10515e33-0139-428e-85c1-3c8893ce11c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-2770b8d9-8240-41e5-9b5d-5424298ff3f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:33 np0005531888 nova_compute[186788]: 2025-11-22 08:15:33.860 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:34 np0005531888 nova_compute[186788]: 2025-11-22 08:15:34.439 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.652 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.653 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.653 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.653 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.653 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.661 186792 INFO nova.compute.manager [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Terminating instance#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.669 186792 DEBUG nova.compute.manager [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:15:35 np0005531888 kernel: tapdacaeb80-94 (unregistering): left promiscuous mode
Nov 22 03:15:35 np0005531888 NetworkManager[55166]: <info>  [1763799335.6934] device (tapdacaeb80-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:15:35 np0005531888 podman[237602]: 2025-11-22 08:15:35.693764715 +0000 UTC m=+0.066299563 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:15:35 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:35Z|00566|binding|INFO|Releasing lport dacaeb80-9494-4184-a865-eeefe225e904 from this chassis (sb_readonly=0)
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.702 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:35Z|00567|binding|INFO|Setting lport dacaeb80-9494-4184-a865-eeefe225e904 down in Southbound
Nov 22 03:15:35 np0005531888 ovn_controller[95067]: 2025-11-22T08:15:35Z|00568|binding|INFO|Removing iface tapdacaeb80-94 ovn-installed in OVS
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.706 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.711 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:6f:35 10.100.0.10'], port_security=['fa:16:3e:60:6f:35 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2770b8d9-8240-41e5-9b5d-5424298ff3f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=dacaeb80-9494-4184-a865-eeefe225e904) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.713 104023 INFO neutron.agent.ovn.metadata.agent [-] Port dacaeb80-9494-4184-a865-eeefe225e904 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.715 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.718 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.730 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7fc193-0588-4f16-9439-d0ccce0f3283]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:35 np0005531888 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000091.scope: Deactivated successfully.
Nov 22 03:15:35 np0005531888 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000091.scope: Consumed 4.506s CPU time.
Nov 22 03:15:35 np0005531888 systemd-machined[153106]: Machine qemu-69-instance-00000091 terminated.
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.756 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a85ad5e7-e926-466b-a447-b87df69484e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.759 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bf3d0ac3-8121-4c8c-8e2c-17ffe53918c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.782 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e24b47c6-9c21-4060-a13e-1d0fdb1f7867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.797 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e62681-cb3c-4a0f-ad44-1ea0f99d6ebb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596519, 'reachable_time': 17106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237634, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.813 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[94c35074-3d22-4918-b481-5be2cc3a2ea9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596531, 'tstamp': 596531}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237635, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap66c945b4-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 596534, 'tstamp': 596534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237635, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.815 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.816 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.821 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.821 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.821 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.822 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:35.822 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.953 186792 INFO nova.virt.libvirt.driver [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Instance destroyed successfully.#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.954 186792 DEBUG nova.objects.instance [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid 2770b8d9-8240-41e5-9b5d-5424298ff3f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.972 186792 DEBUG nova.virt.libvirt.vif [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1143915207',display_name='tempest-ServersTestJSON-server-1143915207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1143915207',id=145,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:15:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-qd8ypeff',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:15:34Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=2770b8d9-8240-41e5-9b5d-5424298ff3f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.973 186792 DEBUG nova.network.os_vif_util [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "dacaeb80-9494-4184-a865-eeefe225e904", "address": "fa:16:3e:60:6f:35", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdacaeb80-94", "ovs_interfaceid": "dacaeb80-9494-4184-a865-eeefe225e904", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.974 186792 DEBUG nova.network.os_vif_util [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.974 186792 DEBUG os_vif [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.976 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdacaeb80-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.978 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.982 186792 INFO os_vif [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6f:35,bridge_name='br-int',has_traffic_filtering=True,id=dacaeb80-9494-4184-a865-eeefe225e904,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdacaeb80-94')#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.983 186792 INFO nova.virt.libvirt.driver [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Deleting instance files /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8_del#033[00m
Nov 22 03:15:35 np0005531888 nova_compute[186788]: 2025-11-22 08:15:35.983 186792 INFO nova.virt.libvirt.driver [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Deletion of /var/lib/nova/instances/2770b8d9-8240-41e5-9b5d-5424298ff3f8_del complete#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.062 186792 INFO nova.compute.manager [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.063 186792 DEBUG oslo.service.loopingcall [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.063 186792 DEBUG nova.compute.manager [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.063 186792 DEBUG nova.network.neutron [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.095 186792 DEBUG nova.compute.manager [req-2d7cd87e-a95a-490d-9453-586e063a073f req-f6aee63a-9f1d-46e8-a1e1-81c5f74a09df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-vif-unplugged-dacaeb80-9494-4184-a865-eeefe225e904 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.095 186792 DEBUG oslo_concurrency.lockutils [req-2d7cd87e-a95a-490d-9453-586e063a073f req-f6aee63a-9f1d-46e8-a1e1-81c5f74a09df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.096 186792 DEBUG oslo_concurrency.lockutils [req-2d7cd87e-a95a-490d-9453-586e063a073f req-f6aee63a-9f1d-46e8-a1e1-81c5f74a09df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.096 186792 DEBUG oslo_concurrency.lockutils [req-2d7cd87e-a95a-490d-9453-586e063a073f req-f6aee63a-9f1d-46e8-a1e1-81c5f74a09df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.096 186792 DEBUG nova.compute.manager [req-2d7cd87e-a95a-490d-9453-586e063a073f req-f6aee63a-9f1d-46e8-a1e1-81c5f74a09df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] No waiting events found dispatching network-vif-unplugged-dacaeb80-9494-4184-a865-eeefe225e904 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:36 np0005531888 nova_compute[186788]: 2025-11-22 08:15:36.096 186792 DEBUG nova.compute.manager [req-2d7cd87e-a95a-490d-9453-586e063a073f req-f6aee63a-9f1d-46e8-a1e1-81c5f74a09df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-vif-unplugged-dacaeb80-9494-4184-a865-eeefe225e904 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:15:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:36.831 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:36.831 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:36.831 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.277 186792 DEBUG nova.network.neutron [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.296 186792 INFO nova.compute.manager [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Took 1.23 seconds to deallocate network for instance.#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.355 186792 DEBUG nova.compute.manager [req-379b66a7-fd7a-4672-8d8b-45e0c949634b req-1a39c6af-1cfa-49b9-954f-bae32703e4cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-vif-deleted-dacaeb80-9494-4184-a865-eeefe225e904 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.371 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.372 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.438 186792 DEBUG nova.compute.provider_tree [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.456 186792 DEBUG nova.scheduler.client.report [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.503 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.544 186792 INFO nova.scheduler.client.report [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance 2770b8d9-8240-41e5-9b5d-5424298ff3f8#033[00m
Nov 22 03:15:37 np0005531888 nova_compute[186788]: 2025-11-22 08:15:37.713 186792 DEBUG oslo_concurrency.lockutils [None req-9dd51263-795d-4050-ab32-020a6e2eb278 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.186 186792 DEBUG nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.186 186792 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.187 186792 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.187 186792 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2770b8d9-8240-41e5-9b5d-5424298ff3f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.187 186792 DEBUG nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] No waiting events found dispatching network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.187 186792 WARNING nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Received unexpected event network-vif-plugged-dacaeb80-9494-4184-a865-eeefe225e904 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:15:38 np0005531888 podman[237654]: 2025-11-22 08:15:38.679158685 +0000 UTC m=+0.053296052 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm)
Nov 22 03:15:38 np0005531888 podman[237655]: 2025-11-22 08:15:38.716457672 +0000 UTC m=+0.087580255 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:15:38 np0005531888 nova_compute[186788]: 2025-11-22 08:15:38.862 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:40 np0005531888 nova_compute[186788]: 2025-11-22 08:15:40.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:43 np0005531888 nova_compute[186788]: 2025-11-22 08:15:43.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:45 np0005531888 nova_compute[186788]: 2025-11-22 08:15:45.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:48 np0005531888 nova_compute[186788]: 2025-11-22 08:15:48.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:49 np0005531888 podman[237699]: 2025-11-22 08:15:49.700670739 +0000 UTC m=+0.069296846 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:15:49 np0005531888 podman[237700]: 2025-11-22 08:15:49.711188427 +0000 UTC m=+0.069392807 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:15:50 np0005531888 nova_compute[186788]: 2025-11-22 08:15:50.952 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799335.9510484, 2770b8d9-8240-41e5-9b5d-5424298ff3f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:50 np0005531888 nova_compute[186788]: 2025-11-22 08:15:50.952 186792 INFO nova.compute.manager [-] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:15:50 np0005531888 nova_compute[186788]: 2025-11-22 08:15:50.970 186792 DEBUG nova.compute.manager [None req-4edbecce-d0da-4771-b1a5-7246d2b19130 - - - - - -] [instance: 2770b8d9-8240-41e5-9b5d-5424298ff3f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:50 np0005531888 nova_compute[186788]: 2025-11-22 08:15:50.981 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:51 np0005531888 nova_compute[186788]: 2025-11-22 08:15:51.808 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:51.807 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:51.810 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:15:53 np0005531888 nova_compute[186788]: 2025-11-22 08:15:53.869 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:54 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:15:54.812 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:55 np0005531888 nova_compute[186788]: 2025-11-22 08:15:55.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:58 np0005531888 nova_compute[186788]: 2025-11-22 08:15:58.871 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:00 np0005531888 podman[237737]: 2025-11-22 08:16:00.732408728 +0000 UTC m=+0.093615054 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:16:00 np0005531888 nova_compute[186788]: 2025-11-22 08:16:00.985 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:01 np0005531888 nova_compute[186788]: 2025-11-22 08:16:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:03 np0005531888 podman[237758]: 2025-11-22 08:16:03.732331405 +0000 UTC m=+0.084956359 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:16:03 np0005531888 nova_compute[186788]: 2025-11-22 08:16:03.874 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:03 np0005531888 nova_compute[186788]: 2025-11-22 08:16:03.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:03 np0005531888 nova_compute[186788]: 2025-11-22 08:16:03.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:03 np0005531888 nova_compute[186788]: 2025-11-22 08:16:03.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:16:03 np0005531888 nova_compute[186788]: 2025-11-22 08:16:03.996 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:16:05 np0005531888 nova_compute[186788]: 2025-11-22 08:16:05.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:06 np0005531888 podman[237782]: 2025-11-22 08:16:06.71141347 +0000 UTC m=+0.075146869 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Nov 22 03:16:06 np0005531888 nova_compute[186788]: 2025-11-22 08:16:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:06 np0005531888 nova_compute[186788]: 2025-11-22 08:16:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:08 np0005531888 nova_compute[186788]: 2025-11-22 08:16:08.875 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:09 np0005531888 podman[237805]: 2025-11-22 08:16:09.69874474 +0000 UTC m=+0.073894389 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:16:09 np0005531888 podman[237804]: 2025-11-22 08:16:09.71340211 +0000 UTC m=+0.088467487 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:16:10 np0005531888 nova_compute[186788]: 2025-11-22 08:16:10.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:10 np0005531888 nova_compute[186788]: 2025-11-22 08:16:10.988 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:12 np0005531888 nova_compute[186788]: 2025-11-22 08:16:12.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:13 np0005531888 nova_compute[186788]: 2025-11-22 08:16:13.878 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:13 np0005531888 nova_compute[186788]: 2025-11-22 08:16:13.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:13 np0005531888 nova_compute[186788]: 2025-11-22 08:16:13.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:13 np0005531888 nova_compute[186788]: 2025-11-22 08:16:13.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:13 np0005531888 nova_compute[186788]: 2025-11-22 08:16:13.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:13 np0005531888 nova_compute[186788]: 2025-11-22 08:16:13.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.048 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.123 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.125 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.180 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.331 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.333 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5533MB free_disk=73.24515151977539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.333 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.334 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.403 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.404 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.405 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.448 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.458 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.567 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:16:14 np0005531888 nova_compute[186788]: 2025-11-22 08:16:14.568 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:15 np0005531888 nova_compute[186788]: 2025-11-22 08:16:15.990 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:17 np0005531888 nova_compute[186788]: 2025-11-22 08:16:17.569 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:17 np0005531888 nova_compute[186788]: 2025-11-22 08:16:17.570 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:16:18 np0005531888 nova_compute[186788]: 2025-11-22 08:16:18.879 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:20 np0005531888 podman[237858]: 2025-11-22 08:16:20.699046483 +0000 UTC m=+0.067652875 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:16:20 np0005531888 podman[237859]: 2025-11-22 08:16:20.722308145 +0000 UTC m=+0.081574258 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 22 03:16:20 np0005531888 nova_compute[186788]: 2025-11-22 08:16:20.993 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:23 np0005531888 nova_compute[186788]: 2025-11-22 08:16:23.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:25 np0005531888 nova_compute[186788]: 2025-11-22 08:16:25.994 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:28 np0005531888 nova_compute[186788]: 2025-11-22 08:16:28.882 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:30 np0005531888 nova_compute[186788]: 2025-11-22 08:16:30.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:30 np0005531888 nova_compute[186788]: 2025-11-22 08:16:30.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:31 np0005531888 podman[237897]: 2025-11-22 08:16:31.671752127 +0000 UTC m=+0.046916665 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:16:33 np0005531888 nova_compute[186788]: 2025-11-22 08:16:33.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:34 np0005531888 podman[237917]: 2025-11-22 08:16:34.676417192 +0000 UTC m=+0.056734327 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:16:34 np0005531888 nova_compute[186788]: 2025-11-22 08:16:34.922 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:34.923 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:16:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:34.924 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:16:35 np0005531888 nova_compute[186788]: 2025-11-22 08:16:35.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:36.832 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:36.833 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:36.834 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.849 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'name': 'tempest-₡-19652717', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000089', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '70cb231da30d4002a985cf18a579cd6a', 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'hostId': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.879 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.latency volume: 2551870973 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.879 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.latency volume: 159932662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cdb07f8-71e4-463b-a3ec-14d2da00dbd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2551870973, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.850952', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9090c890-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': 'ae3fc805e78d0b7cb37e088435d43abc7beb527f91b501eafdb60df72af77de8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 159932662, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 
'2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.850952', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9090d524-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '470f05ef296a32a70503decadfb43fe1fde36dac300654c23010fab9c3397c60'}]}, 'timestamp': '2025-11-22 08:16:36.880219', '_unique_id': 'd1caff41286b4e2a8c706ed701ff7ea9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.881 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.882 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.882 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd35507ff-b7e1-4e5d-8f30-8305abd44f41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.882500', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90913ac8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '5a6f96557420d51b480ed80a0ac2fd3dd1cac8b6ad81c8d86e16e57cb7d48890'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.882500', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90914482-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': 'b122745ca161051ee2211daf5fdfabdcea63bd0387a92b7f38c54e6205d34c95'}]}, 'timestamp': '2025-11-22 08:16:36.883055', '_unique_id': '281ecce63d4e490185cd113a52c83283'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.883 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.886 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd817d7e-3b4d-4ce3-b7e3-f4fab74a924e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.884644', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '9091e6d0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '8a4194bd1cbace18688ba9082ba79bed99e1c917e98dddda0910f4ebb2eb9383'}]}, 'timestamp': '2025-11-22 08:16:36.887277', '_unique_id': 'dad7a21a67eb4dbca9ae78ade8e1cc2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.890 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.890 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.890 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.latency volume: 36759977790 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.891 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0224b4fd-53c2-458a-9d5a-ec665421cd4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 36759977790, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.890528', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90927992-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': 'c72fae2ea8b0be0b802a3274e347e09c2722bdc629b99981e2b29106cab8e3b6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.890528', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90928e96-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '746a89418777ccd0cf8e8f8f512695bc4d7b9b6dd29905dde9f6986789afdd30'}]}, 'timestamp': '2025-11-22 08:16:36.891588', '_unique_id': '1c397e695e384e7180417b1e9f20be75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a3a4f40-0bdb-4d11-9ecd-5aed1f81c8b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.894002', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '9092fd4a-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '527bea9c8821dff8094f52a7e7a8b797a0d0c25f2b0aae46a5989912552bcd73'}]}, 'timestamp': '2025-11-22 08:16:36.894375', '_unique_id': '450d41bb2ed64077acf86cfbc30f29cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.896 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.bytes volume: 73007104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.896 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0acd3b52-39a7-41ce-9500-241138f54315', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73007104, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.896349', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9093588a-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '2819bd435502b67e3607cad24ff3c61b7d42b4bbc6ba67ae6db9e43ccd61316d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.896349', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9093642e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': 'c4e9a22c6a438bb44a519f878302b14df9004df70f9e89c0621d63f3bead29f1'}]}, 'timestamp': '2025-11-22 08:16:36.896981', '_unique_id': '91ae5ad9aecf4d34ad2a228f55027002'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.897 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.898 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.910 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.910 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4d589da-19a3-47df-8739-93dc4318649d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.898911', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90957408-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.598447454, 'message_signature': 'e3ae8e6d50232fbfa95addb17a3202ce8ce9abb1fb1bd7ce156e520b6e9d077c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 
'2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.898911', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '909581f0-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.598447454, 'message_signature': 'bd63c6894333105e8a102369fafa5ccd9cb338701df535d9d856550c7807b372'}]}, 'timestamp': '2025-11-22 08:16:36.910859', '_unique_id': '9d7ed43e7031474bbef4e3509062ca9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.912 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.bytes.delta volume: 672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aa9db62-976d-439b-8b31-86eb34c4cc82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 672, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.912898', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '9095dc86-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '278c4157bb098c73b8ceada0364b211b07d56e5252c1735c050a773dab3d38aa'}]}, 'timestamp': '2025-11-22 08:16:36.913175', '_unique_id': '5c0a0d661144454195d39e341b1e803c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.914 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d2b5700-73db-4742-8b31-63a7077238e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.914695', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '909622d6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '96169648474bf73833a5536dd30898899cc59c85e09be3ec1791be7ba32213d9'}]}, 'timestamp': '2025-11-22 08:16:36.914974', '_unique_id': '8cf38b4ee6e345619a92a09dc31a9411'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.916 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.916 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58ec2e44-66d3-4667-a029-65ada0ebb26b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.916520', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '90966bd8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '651dbe0341098e1d3e121dc3ca8a1b1c68e4b395ce4b372a4e0e9fceb73058bd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.916520', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '90967588-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '8c1cad7d677e8379623abbd46f3fd29ccee13268b1003cb8ea622a328501ffcf'}]}, 'timestamp': '2025-11-22 08:16:36.917110', '_unique_id': 'a4b12df132e7455292b9ea1ec2c0babc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.918 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a43573f-de28-4e41-8823-a7a5cf9280d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.918890', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '9096c7fe-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '3382f4acf04a2d348f83e88af4b747aa12cb689eeae45106be93a25a2dde6360'}]}, 'timestamp': '2025-11-22 08:16:36.919243', '_unique_id': 'df57c2a068b44421bf27c58f0bcd85c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.921 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59eba214-8914-4570-9678-0480a9b59e4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.921693', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '90973414-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '1130ba2d8fc42ce8e5e66bbf8d1f42e314a09b6e1f2d04f8bb1a8527722acd9f'}]}, 'timestamp': '2025-11-22 08:16:36.921977', '_unique_id': '0a2f270d5c0f4acdabdddc677a71186f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.923 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.935 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/cpu volume: 15810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e57010c-abe3-4f23-aadf-6383ad4b711f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15810000000, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'timestamp': '2025-11-22T08:16:36.923666', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9099631a-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.635366963, 'message_signature': 'aef3025231feabf865586743dc4c8a0bbe3a13b8a857763162e32433dd0e3fce'}]}, 'timestamp': '2025-11-22 08:16:36.936301', '_unique_id': 'f7d95d5d717a467590926926de9878d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9ee34b0-1a91-4a29-96f8-88147d62cedb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.938032', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '9099b6bc-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': 'e25063c4eff8ec349e0e0ef343bf3a22b8ebe462222ae3cc48d0d6fe6f4c5fad'}]}, 'timestamp': '2025-11-22 08:16:36.938427', '_unique_id': '9523f64343e743a6afcfb40d3bd7a596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.939 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.940 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/memory.usage volume: 42.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '309a6d98-fb19-48ef-aac2-28960784ac60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.63671875, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'timestamp': '2025-11-22T08:16:36.940194', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '909a0900-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.635366963, 'message_signature': '5b97c779edd5aa8e27828890cb7d3bee695969551a994b031016ce615c805e18'}]}, 'timestamp': '2025-11-22 08:16:36.940636', '_unique_id': '0fed8a71eb234dd8a51b9ef5c38c5f6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.941 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.942 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.942 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.requests volume: 295 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.942 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd19e4d09-62a0-49da-99ff-6c8a2cb1f94b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 295, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.942331', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '909a591e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '58cfbeb0aa48429784a43fed4b85c1eb36d2eaa685b649e30413fb0e52078063'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.942331', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '909a62c4-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.550501525, 'message_signature': '6573288ddd0fa17dc22391dd835bb355ee80810395f6520001bea5970a66311b'}]}, 'timestamp': '2025-11-22 08:16:36.942813', '_unique_id': '7845b9734a0140fca1451e08827c366d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.944 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.944 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08549e08-9b1e-4b9b-bc3b-10174e65a7b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.944154', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '909a9fd2-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.598447454, 'message_signature': '9ac4c951d7333a7bdafb86318b71e267bf53fe37782a28a7aba115717caeea9d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.944154', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '909aa900-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.598447454, 'message_signature': '2fc017eccba24f592b177da0e868ef6edd4f456d1c5e97c2d5fa40fa4dc67a46'}]}, 'timestamp': '2025-11-22 08:16:36.944627', '_unique_id': 'a2625851d8274a92a61030b07aecc25c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.945 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.incoming.bytes volume: 2024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae9a69e2-c628-4fa8-b336-4a0cb1201efe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2024, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.945831', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '909ae172-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '7e96a6f114b0c9b8666b0e15df24a43dd5392e630bc75f27cd7bb800da1d2bd7'}]}, 'timestamp': '2025-11-22 08:16:36.946070', '_unique_id': 'a83e4ed6ec0844b4885f19a5a0b9af8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.947 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c45df53-7a0a-43fe-a016-a52407f09fbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.947343', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '909b1c64-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': 'd573ac8483fd5cf7bba57cb71f7a1f84235fbab4053e0ccf3d9e50c45389db36'}]}, 'timestamp': '2025-11-22 08:16:36.947597', '_unique_id': '6a511a32757e45fdba195d7655f0e56b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.948 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.949 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6e4939b-18cd-48b2-b3e7-ef1920a84b70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-vda', 'timestamp': '2025-11-22T08:16:36.948876', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '909b59ea-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.598447454, 'message_signature': 'a3be8666590cba6fb09fa2c765e16f57cf64c6d23512908f12e7e19694e0b1db'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-sda', 'timestamp': '2025-11-22T08:16:36.948876', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'instance-00000089', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '909b6390-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.598447454, 'message_signature': '7bef6ad4bdad455e36f1ce1e1ab831260ed4ee319815f3cee18e4d23a1428c1f'}]}, 'timestamp': '2025-11-22 08:16:36.949386', '_unique_id': '2b66643d687a407c9d28226b9c96a82d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.950 12 DEBUG ceilometer.compute.pollsters [-] 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '529b88df-511d-4a3e-beba-e2684a852fdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_name': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_name': None, 'resource_id': 'instance-00000089-2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-tap3c3e4988-08', 'timestamp': '2025-11-22T08:16:36.950771', 'resource_metadata': {'display_name': 'tempest-₡-19652717', 'name': 'tap3c3e4988-08', 'instance_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'instance_type': 'm1.nano', 'host': '3e27fa2eda21bcc45897fa02751cebefc332e4a18053adf939da6a73', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:ba:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c3e4988-08'}, 'message_id': '909ba3d2-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6137.584159213, 'message_signature': '756668dfbcf1242bffdebe6320016af5b66ee4862f5efc2bd2d700c8b7a15714'}]}, 'timestamp': '2025-11-22 08:16:36.951092', '_unique_id': 'eaca1c13f27448438c1f444e29c43a5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:16:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:16:37 np0005531888 podman[237941]: 2025-11-22 08:16:37.698282119 +0000 UTC m=+0.069373767 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41)
Nov 22 03:16:38 np0005531888 nova_compute[186788]: 2025-11-22 08:16:38.887 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:40 np0005531888 podman[237962]: 2025-11-22 08:16:40.683810014 +0000 UTC m=+0.061268707 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:16:40 np0005531888 podman[237963]: 2025-11-22 08:16:40.726540785 +0000 UTC m=+0.100758879 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 03:16:41 np0005531888 nova_compute[186788]: 2025-11-22 08:16:41.001 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:42.929 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:43 np0005531888 nova_compute[186788]: 2025-11-22 08:16:43.890 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.004 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.148 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.149 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.149 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.149 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.149 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.156 186792 INFO nova.compute.manager [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Terminating instance#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.162 186792 DEBUG nova.compute.manager [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:16:46 np0005531888 kernel: tap3c3e4988-08 (unregistering): left promiscuous mode
Nov 22 03:16:46 np0005531888 NetworkManager[55166]: <info>  [1763799406.1953] device (tap3c3e4988-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:16:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:16:46Z|00569|binding|INFO|Releasing lport 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 from this chassis (sb_readonly=0)
Nov 22 03:16:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:16:46Z|00570|binding|INFO|Setting lport 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 down in Southbound
Nov 22 03:16:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:16:46Z|00571|binding|INFO|Removing iface tap3c3e4988-08 ovn-installed in OVS
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.203 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.219 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000089.scope: Deactivated successfully.
Nov 22 03:16:46 np0005531888 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000089.scope: Consumed 24.241s CPU time.
Nov 22 03:16:46 np0005531888 systemd-machined[153106]: Machine qemu-66-instance-00000089 terminated.
Nov 22 03:16:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:46.312 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:ba:92 10.100.0.4'], port_security=['fa:16:3e:c6:ba:92 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2f6a7fca-8a29-4c0c-936f-8184ac3b4abe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=3c3e4988-0822-4b4c-9326-3cf6ec5155d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:16:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:46.313 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 3c3e4988-0822-4b4c-9326-3cf6ec5155d9 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis#033[00m
Nov 22 03:16:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:46.315 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66c945b4-7237-4e85-b411-0c51b31ea31a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:16:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:46.316 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fc3f5992-0d79-4a5c-b25a-cc4cf4bda05a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:46.316 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace which is not needed anymore#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.384 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.421 186792 INFO nova.virt.libvirt.driver [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Instance destroyed successfully.#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.422 186792 DEBUG nova.objects.instance [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.433 186792 DEBUG nova.virt.libvirt.vif [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:13:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-19652717',display_name='tempest-₡-19652717',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest--19652717',id=137,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:13:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-q1p88s0k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1620770071',
owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:13:45Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=2f6a7fca-8a29-4c0c-936f-8184ac3b4abe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.434 186792 DEBUG nova.network.os_vif_util [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "address": "fa:16:3e:c6:ba:92", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3e4988-08", "ovs_interfaceid": "3c3e4988-0822-4b4c-9326-3cf6ec5155d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.435 186792 DEBUG nova.network.os_vif_util [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.435 186792 DEBUG os_vif [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.436 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.436 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c3e4988-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.438 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.439 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.442 186792 INFO os_vif [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:ba:92,bridge_name='br-int',has_traffic_filtering=True,id=3c3e4988-0822-4b4c-9326-3cf6ec5155d9,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3e4988-08')#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.443 186792 INFO nova.virt.libvirt.driver [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Deleting instance files /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe_del#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.444 186792 INFO nova.virt.libvirt.driver [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Deletion of /var/lib/nova/instances/2f6a7fca-8a29-4c0c-936f-8184ac3b4abe_del complete#033[00m
Nov 22 03:16:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [NOTICE]   (236795) : haproxy version is 2.8.14-c23fe91
Nov 22 03:16:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [NOTICE]   (236795) : path to executable is /usr/sbin/haproxy
Nov 22 03:16:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [WARNING]  (236795) : Exiting Master process...
Nov 22 03:16:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [WARNING]  (236795) : Exiting Master process...
Nov 22 03:16:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [ALERT]    (236795) : Current worker (236797) exited with code 143 (Terminated)
Nov 22 03:16:46 np0005531888 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[236774]: [WARNING]  (236795) : All workers exited. Exiting... (0)
Nov 22 03:16:46 np0005531888 systemd[1]: libpod-65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74.scope: Deactivated successfully.
Nov 22 03:16:46 np0005531888 podman[238037]: 2025-11-22 08:16:46.635688399 +0000 UTC m=+0.228132211 container died 65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.661 186792 INFO nova.compute.manager [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.663 186792 DEBUG oslo.service.loopingcall [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.663 186792 DEBUG nova.compute.manager [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.663 186792 DEBUG nova.network.neutron [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.946 186792 DEBUG nova.compute.manager [req-4fa4101d-d317-4efb-ab53-0205d3b08b93 req-c7514dbc-ecf1-4612-b61c-de8b95dc52b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-vif-unplugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.947 186792 DEBUG oslo_concurrency.lockutils [req-4fa4101d-d317-4efb-ab53-0205d3b08b93 req-c7514dbc-ecf1-4612-b61c-de8b95dc52b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.947 186792 DEBUG oslo_concurrency.lockutils [req-4fa4101d-d317-4efb-ab53-0205d3b08b93 req-c7514dbc-ecf1-4612-b61c-de8b95dc52b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.948 186792 DEBUG oslo_concurrency.lockutils [req-4fa4101d-d317-4efb-ab53-0205d3b08b93 req-c7514dbc-ecf1-4612-b61c-de8b95dc52b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.948 186792 DEBUG nova.compute.manager [req-4fa4101d-d317-4efb-ab53-0205d3b08b93 req-c7514dbc-ecf1-4612-b61c-de8b95dc52b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] No waiting events found dispatching network-vif-unplugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:16:46 np0005531888 nova_compute[186788]: 2025-11-22 08:16:46.948 186792 DEBUG nova.compute.manager [req-4fa4101d-d317-4efb-ab53-0205d3b08b93 req-c7514dbc-ecf1-4612-b61c-de8b95dc52b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-vif-unplugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:16:47 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74-userdata-shm.mount: Deactivated successfully.
Nov 22 03:16:47 np0005531888 systemd[1]: var-lib-containers-storage-overlay-b9d7125f32fa27413e4966b8f6aa5a34401206ab17e03fc75c2bbebd2a1a7c11-merged.mount: Deactivated successfully.
Nov 22 03:16:47 np0005531888 podman[238037]: 2025-11-22 08:16:47.350724705 +0000 UTC m=+0.943168507 container cleanup 65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:16:47 np0005531888 systemd[1]: libpod-conmon-65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74.scope: Deactivated successfully.
Nov 22 03:16:47 np0005531888 podman[238078]: 2025-11-22 08:16:47.869132774 +0000 UTC m=+0.494568034 container remove 65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.875 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[539b7046-1940-4189-aff7-fa0b8192b8f0]: (4, ('Sat Nov 22 08:16:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74)\n65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74\nSat Nov 22 08:16:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74)\n65c83c767ec24ab97dfb6daf7f2f3ce276292239ba1f4a25c01e346a88a41f74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.877 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9bcd025c-6164-4ea1-be26-8ef95648c930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.878 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:47 np0005531888 nova_compute[186788]: 2025-11-22 08:16:47.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:47 np0005531888 kernel: tap66c945b4-70: left promiscuous mode
Nov 22 03:16:47 np0005531888 nova_compute[186788]: 2025-11-22 08:16:47.893 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.896 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5089f92c-bfbd-4940-abce-85fe76cf876c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.911 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2cfdd74b-7352-4ca9-953c-e8f9d20441f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.912 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9e236d07-eeaf-4066-9f8f-fa7ce3fe0f7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.930 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce8c424-6e34-49e5-9b05-c8070fb68b87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 596512, 'reachable_time': 16717, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238095, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.933 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:16:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:16:47.934 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f8db5cf5-1ce5-442f-ba9d-2f743b3e43ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:47 np0005531888 systemd[1]: run-netns-ovnmeta\x2d66c945b4\x2d7237\x2d4e85\x2db411\x2d0c51b31ea31a.mount: Deactivated successfully.
Nov 22 03:16:48 np0005531888 nova_compute[186788]: 2025-11-22 08:16:48.892 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.038 186792 DEBUG nova.compute.manager [req-5c2209fa-2ce1-4d4d-89db-4dca507d25f6 req-c810f578-56d9-4bfe-b7f9-64203db66dba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.038 186792 DEBUG oslo_concurrency.lockutils [req-5c2209fa-2ce1-4d4d-89db-4dca507d25f6 req-c810f578-56d9-4bfe-b7f9-64203db66dba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.039 186792 DEBUG oslo_concurrency.lockutils [req-5c2209fa-2ce1-4d4d-89db-4dca507d25f6 req-c810f578-56d9-4bfe-b7f9-64203db66dba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.039 186792 DEBUG oslo_concurrency.lockutils [req-5c2209fa-2ce1-4d4d-89db-4dca507d25f6 req-c810f578-56d9-4bfe-b7f9-64203db66dba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.039 186792 DEBUG nova.compute.manager [req-5c2209fa-2ce1-4d4d-89db-4dca507d25f6 req-c810f578-56d9-4bfe-b7f9-64203db66dba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] No waiting events found dispatching network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.039 186792 WARNING nova.compute.manager [req-5c2209fa-2ce1-4d4d-89db-4dca507d25f6 req-c810f578-56d9-4bfe-b7f9-64203db66dba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received unexpected event network-vif-plugged-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 for instance with vm_state active and task_state deleting.
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.185 186792 DEBUG nova.network.neutron [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.203 186792 INFO nova.compute.manager [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Took 4.54 seconds to deallocate network for instance.
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.291 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.292 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.344 186792 DEBUG nova.compute.manager [req-ca6a88f9-a83a-4a5d-b260-2136bed5085b req-a519d94d-ca05-4c62-a516-64c3a0d81987 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Received event network-vif-deleted-3c3e4988-0822-4b4c-9326-3cf6ec5155d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.394 186792 DEBUG nova.compute.provider_tree [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.421 186792 DEBUG nova.scheduler.client.report [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.440 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.453 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.487 186792 INFO nova.scheduler.client.report [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe
Nov 22 03:16:51 np0005531888 nova_compute[186788]: 2025-11-22 08:16:51.578 186792 DEBUG oslo_concurrency.lockutils [None req-c85f285d-91b5-410e-8b77-b8765f962e19 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "2f6a7fca-8a29-4c0c-936f-8184ac3b4abe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:16:51 np0005531888 podman[238097]: 2025-11-22 08:16:51.675604688 +0000 UTC m=+0.049682124 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:16:51 np0005531888 podman[238096]: 2025-11-22 08:16:51.676631393 +0000 UTC m=+0.051395195 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:16:53 np0005531888 nova_compute[186788]: 2025-11-22 08:16:53.894 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:16:56 np0005531888 nova_compute[186788]: 2025-11-22 08:16:56.235 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:16:56 np0005531888 nova_compute[186788]: 2025-11-22 08:16:56.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:16:58 np0005531888 nova_compute[186788]: 2025-11-22 08:16:58.897 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:01 np0005531888 nova_compute[186788]: 2025-11-22 08:17:01.421 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799406.4193513, 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:17:01 np0005531888 nova_compute[186788]: 2025-11-22 08:17:01.421 186792 INFO nova.compute.manager [-] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] VM Stopped (Lifecycle Event)
Nov 22 03:17:01 np0005531888 nova_compute[186788]: 2025-11-22 08:17:01.441 186792 DEBUG nova.compute.manager [None req-94c14379-5b3f-43cc-9f25-8b481b146ab8 - - - - - -] [instance: 2f6a7fca-8a29-4c0c-936f-8184ac3b4abe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:17:01 np0005531888 nova_compute[186788]: 2025-11-22 08:17:01.445 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:01 np0005531888 nova_compute[186788]: 2025-11-22 08:17:01.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:17:02 np0005531888 podman[238136]: 2025-11-22 08:17:02.679488569 +0000 UTC m=+0.057751811 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.592 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "18bc318c-d706-498c-9736-fc71aaa6660c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.593 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "18bc318c-d706-498c-9736-fc71aaa6660c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.625 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.763 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.763 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.774 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.774 186792 INFO nova.compute.claims [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Claim successful on node compute-2.ctlplane.example.com
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.899 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.954 186792 DEBUG nova.compute.provider_tree [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:17:03 np0005531888 nova_compute[186788]: 2025-11-22 08:17:03.966 186792 DEBUG nova.scheduler.client.report [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.014 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.015 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.120 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.155 186792 INFO nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.196 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.347 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.348 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.349 186792 INFO nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Creating image(s)
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.349 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.349 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.350 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.363 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.420 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.421 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.421 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.434 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.489 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.490 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.565 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk 1073741824" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.567 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.567 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.622 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.623 186792 DEBUG nova.virt.disk.api [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Checking if we can resize image /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.623 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.678 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.679 186792 DEBUG nova.virt.disk.api [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Cannot resize image /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.679 186792 DEBUG nova.objects.instance [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.698 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.699 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Ensure instance console log exists: /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.699 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.700 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.700 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.701 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.706 186792 WARNING nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.710 186792 DEBUG nova.virt.libvirt.host [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.710 186792 DEBUG nova.virt.libvirt.host [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.713 186792 DEBUG nova.virt.libvirt.host [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.714 186792 DEBUG nova.virt.libvirt.host [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.715 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.716 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.716 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.716 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.716 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.717 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.717 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.717 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.717 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.718 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.718 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.718 186792 DEBUG nova.virt.hardware [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.723 186792 DEBUG nova.objects.instance [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.745 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <uuid>18bc318c-d706-498c-9736-fc71aaa6660c</uuid>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <name>instance-00000094</name>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerShowV254Test-server-636223678</nova:name>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:17:04</nova:creationTime>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:user uuid="18cd88673a4c4a8d99c2d8402355658f">tempest-ServerShowV254Test-1383239291-project-member</nova:user>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:        <nova:project uuid="f8f9152c32bc421ab7cc2d80a473c0d6">tempest-ServerShowV254Test-1383239291</nova:project>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <entry name="serial">18bc318c-d706-498c-9736-fc71aaa6660c</entry>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <entry name="uuid">18bc318c-d706-498c-9736-fc71aaa6660c</entry>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/console.log" append="off"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:17:04 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:17:04 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:17:04 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:17:04 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.821 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.821 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.821 186792 INFO nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Using config drive#033[00m
Nov 22 03:17:04 np0005531888 podman[238173]: 2025-11-22 08:17:04.843486009 +0000 UTC m=+0.072707449 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:17:04 np0005531888 nova_compute[186788]: 2025-11-22 08:17:04.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.045 186792 INFO nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Creating config drive at /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.050 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9mwxh1ke execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.175 186792 DEBUG oslo_concurrency.processutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9mwxh1ke" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:05 np0005531888 systemd-machined[153106]: New machine qemu-70-instance-00000094.
Nov 22 03:17:05 np0005531888 systemd[1]: Started Virtual Machine qemu-70-instance-00000094.
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.648 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799425.647552, 18bc318c-d706-498c-9736-fc71aaa6660c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.649 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.652 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.652 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.656 186792 INFO nova.virt.libvirt.driver [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance spawned successfully.#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.657 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.680 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.686 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.689 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.689 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.690 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.690 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.691 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.691 186792 DEBUG nova.virt.libvirt.driver [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.737 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.737 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799425.6487513, 18bc318c-d706-498c-9736-fc71aaa6660c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.738 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] VM Started (Lifecycle Event)#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.774 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.777 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.825 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.876 186792 INFO nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Took 1.53 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.876 186792 DEBUG nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:17:05 np0005531888 nova_compute[186788]: 2025-11-22 08:17:05.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.023 186792 INFO nova.compute.manager [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Took 2.30 seconds to build instance.#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.098 186792 DEBUG oslo_concurrency.lockutils [None req-39b16b10-97e4-452f-9e60-f2750bba64b7 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "18bc318c-d706-498c-9736-fc71aaa6660c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.151 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-18bc318c-d706-498c-9736-fc71aaa6660c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.152 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-18bc318c-d706-498c-9736-fc71aaa6660c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.152 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.153 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.413 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:17:06 np0005531888 nova_compute[186788]: 2025-11-22 08:17:06.448 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:07 np0005531888 nova_compute[186788]: 2025-11-22 08:17:07.095 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:17:07 np0005531888 nova_compute[186788]: 2025-11-22 08:17:07.112 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-18bc318c-d706-498c-9736-fc71aaa6660c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:17:07 np0005531888 nova_compute[186788]: 2025-11-22 08:17:07.112 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:17:07 np0005531888 nova_compute[186788]: 2025-11-22 08:17:07.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:07 np0005531888 nova_compute[186788]: 2025-11-22 08:17:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:08 np0005531888 podman[238224]: 2025-11-22 08:17:08.706187665 +0000 UTC m=+0.070556066 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Nov 22 03:17:08 np0005531888 nova_compute[186788]: 2025-11-22 08:17:08.900 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.197 186792 INFO nova.compute.manager [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Rebuilding instance#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.543 186792 DEBUG nova.compute.manager [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.793 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'pci_requests' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.809 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.827 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'resources' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.850 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.868 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 03:17:09 np0005531888 nova_compute[186788]: 2025-11-22 08:17:09.873 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:17:10 np0005531888 nova_compute[186788]: 2025-11-22 08:17:10.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:11 np0005531888 nova_compute[186788]: 2025-11-22 08:17:11.451 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:11 np0005531888 podman[238245]: 2025-11-22 08:17:11.72508554 +0000 UTC m=+0.095301435 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:17:11 np0005531888 podman[238246]: 2025-11-22 08:17:11.742898977 +0000 UTC m=+0.116120237 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 03:17:13 np0005531888 nova_compute[186788]: 2025-11-22 08:17:13.903 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:14 np0005531888 nova_compute[186788]: 2025-11-22 08:17:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:14 np0005531888 nova_compute[186788]: 2025-11-22 08:17:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:14 np0005531888 nova_compute[186788]: 2025-11-22 08:17:14.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:14 np0005531888 nova_compute[186788]: 2025-11-22 08:17:14.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:14 np0005531888 nova_compute[186788]: 2025-11-22 08:17:14.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:14 np0005531888 nova_compute[186788]: 2025-11-22 08:17:14.989 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.080 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.132 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.134 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.193 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.359 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.361 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5591MB free_disk=73.2735366821289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.363 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.364 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.908 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 18bc318c-d706-498c-9736-fc71aaa6660c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.908 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.909 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:17:15 np0005531888 nova_compute[186788]: 2025-11-22 08:17:15.989 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:17:16 np0005531888 nova_compute[186788]: 2025-11-22 08:17:16.002 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:17:16 np0005531888 nova_compute[186788]: 2025-11-22 08:17:16.313 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:17:16 np0005531888 nova_compute[186788]: 2025-11-22 08:17:16.316 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.953s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:16 np0005531888 nova_compute[186788]: 2025-11-22 08:17:16.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:18 np0005531888 nova_compute[186788]: 2025-11-22 08:17:18.905 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:19 np0005531888 nova_compute[186788]: 2025-11-22 08:17:19.317 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:19 np0005531888 nova_compute[186788]: 2025-11-22 08:17:19.318 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:17:19 np0005531888 nova_compute[186788]: 2025-11-22 08:17:19.923 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:17:21 np0005531888 nova_compute[186788]: 2025-11-22 08:17:21.457 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:22 np0005531888 podman[238297]: 2025-11-22 08:17:22.696743067 +0000 UTC m=+0.059692450 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:17:22 np0005531888 podman[238296]: 2025-11-22 08:17:22.745632069 +0000 UTC m=+0.106469739 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:17:23 np0005531888 nova_compute[186788]: 2025-11-22 08:17:23.906 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:26 np0005531888 nova_compute[186788]: 2025-11-22 08:17:26.460 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:17:27Z|00572|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 22 03:17:28 np0005531888 nova_compute[186788]: 2025-11-22 08:17:28.907 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:30 np0005531888 nova_compute[186788]: 2025-11-22 08:17:30.970 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:17:31 np0005531888 nova_compute[186788]: 2025-11-22 08:17:31.463 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:33 np0005531888 podman[238351]: 2025-11-22 08:17:33.706539934 +0000 UTC m=+0.067781668 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:17:33 np0005531888 nova_compute[186788]: 2025-11-22 08:17:33.908 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:35 np0005531888 podman[238376]: 2025-11-22 08:17:35.715513781 +0000 UTC m=+0.076559955 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:17:36 np0005531888 nova_compute[186788]: 2025-11-22 08:17:36.466 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:17:36.832 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:17:36.833 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:17:36.833 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:38 np0005531888 nova_compute[186788]: 2025-11-22 08:17:38.909 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:39 np0005531888 podman[238401]: 2025-11-22 08:17:39.698355661 +0000 UTC m=+0.062495387 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:17:41 np0005531888 nova_compute[186788]: 2025-11-22 08:17:41.469 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:42 np0005531888 nova_compute[186788]: 2025-11-22 08:17:42.019 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance in state 1 after 32 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:17:42 np0005531888 podman[238422]: 2025-11-22 08:17:42.703826126 +0000 UTC m=+0.073314123 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 03:17:42 np0005531888 podman[238423]: 2025-11-22 08:17:42.746998788 +0000 UTC m=+0.105956447 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:17:43 np0005531888 nova_compute[186788]: 2025-11-22 08:17:43.911 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:45 np0005531888 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000094.scope: Deactivated successfully.
Nov 22 03:17:45 np0005531888 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000094.scope: Consumed 15.879s CPU time.
Nov 22 03:17:45 np0005531888 systemd-machined[153106]: Machine qemu-70-instance-00000094 terminated.
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.039 186792 INFO nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance shutdown successfully after 36 seconds.#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.044 186792 INFO nova.virt.libvirt.driver [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance destroyed successfully.#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.051 186792 INFO nova.virt.libvirt.driver [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance destroyed successfully.#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.051 186792 INFO nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Deleting instance files /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c_del#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.053 186792 INFO nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Deletion of /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c_del complete#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.365 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.366 186792 INFO nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Creating image(s)#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.366 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.366 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.367 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.378 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.468 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.469 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.469 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.481 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.497 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.553 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.554 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.753 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk 1073741824" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.754 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.755 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.810 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.811 186792 DEBUG nova.virt.disk.api [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Checking if we can resize image /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.812 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.906 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.908 186792 DEBUG nova.virt.disk.api [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Cannot resize image /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.909 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.909 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Ensure instance console log exists: /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.910 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.910 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.910 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.913 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.917 186792 WARNING nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.932 186792 DEBUG nova.virt.libvirt.host [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.933 186792 DEBUG nova.virt.libvirt.host [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.937 186792 DEBUG nova.virt.libvirt.host [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.938 186792 DEBUG nova.virt.libvirt.host [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.940 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.940 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.941 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.941 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.941 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.941 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.942 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.942 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.942 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.943 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.943 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.943 186792 DEBUG nova.virt.hardware [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.944 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:17:46 np0005531888 nova_compute[186788]: 2025-11-22 08:17:46.977 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <uuid>18bc318c-d706-498c-9736-fc71aaa6660c</uuid>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <name>instance-00000094</name>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:name>tempest-ServerShowV254Test-server-636223678</nova:name>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:17:46</nova:creationTime>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:user uuid="18cd88673a4c4a8d99c2d8402355658f">tempest-ServerShowV254Test-1383239291-project-member</nova:user>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:        <nova:project uuid="f8f9152c32bc421ab7cc2d80a473c0d6">tempest-ServerShowV254Test-1383239291</nova:project>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <nova:ports/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <entry name="serial">18bc318c-d706-498c-9736-fc71aaa6660c</entry>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <entry name="uuid">18bc318c-d706-498c-9736-fc71aaa6660c</entry>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/console.log" append="off"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:17:46 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:17:46 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:17:46 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:17:46 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.065 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.066 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.068 186792 INFO nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Using config drive
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.102 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.374 186792 INFO nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Creating config drive at /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.378 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw9lj4zn4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.504 186792 DEBUG oslo_concurrency.processutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw9lj4zn4" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:17:47 np0005531888 systemd-machined[153106]: New machine qemu-71-instance-00000094.
Nov 22 03:17:47 np0005531888 systemd[1]: Started Virtual Machine qemu-71-instance-00000094.
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.888 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for 18bc318c-d706-498c-9736-fc71aaa6660c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.889 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799467.8877575, 18bc318c-d706-498c-9736-fc71aaa6660c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.890 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] VM Resumed (Lifecycle Event)
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.895 186792 DEBUG nova.compute.manager [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.895 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.901 186792 INFO nova.virt.libvirt.driver [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance spawned successfully.
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.902 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.907 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.910 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.923 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.923 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.924 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.925 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.925 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.926 186792 DEBUG nova.virt.libvirt.driver [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.949 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.950 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799467.889346, 18bc318c-d706-498c-9736-fc71aaa6660c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.950 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] VM Started (Lifecycle Event)
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.965 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.970 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:17:47 np0005531888 nova_compute[186788]: 2025-11-22 08:17:47.989 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 03:17:48 np0005531888 nova_compute[186788]: 2025-11-22 08:17:48.143 186792 DEBUG nova.compute.manager [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:17:48 np0005531888 nova_compute[186788]: 2025-11-22 08:17:48.279 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:48 np0005531888 nova_compute[186788]: 2025-11-22 08:17:48.280 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:48 np0005531888 nova_compute[186788]: 2025-11-22 08:17:48.280 186792 DEBUG nova.objects.instance [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 03:17:48 np0005531888 nova_compute[186788]: 2025-11-22 08:17:48.383 186792 DEBUG oslo_concurrency.lockutils [None req-a1179c23-a59e-4244-9a22-7e2bf1cc7282 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:48 np0005531888 nova_compute[186788]: 2025-11-22 08:17:48.912 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:51 np0005531888 nova_compute[186788]: 2025-11-22 08:17:51.502 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.444 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "18bc318c-d706-498c-9736-fc71aaa6660c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.445 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "18bc318c-d706-498c-9736-fc71aaa6660c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.445 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "18bc318c-d706-498c-9736-fc71aaa6660c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.445 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "18bc318c-d706-498c-9736-fc71aaa6660c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.445 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "18bc318c-d706-498c-9736-fc71aaa6660c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.453 186792 INFO nova.compute.manager [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Terminating instance
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.459 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "refresh_cache-18bc318c-d706-498c-9736-fc71aaa6660c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.459 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquired lock "refresh_cache-18bc318c-d706-498c-9736-fc71aaa6660c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.459 186792 DEBUG nova.network.neutron [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 03:17:52 np0005531888 nova_compute[186788]: 2025-11-22 08:17:52.636 186792 DEBUG nova.network.neutron [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.059 186792 DEBUG nova.network.neutron [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.074 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Releasing lock "refresh_cache-18bc318c-d706-498c-9736-fc71aaa6660c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.075 186792 DEBUG nova.compute.manager [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 03:17:53 np0005531888 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000094.scope: Deactivated successfully.
Nov 22 03:17:53 np0005531888 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000094.scope: Consumed 5.500s CPU time.
Nov 22 03:17:53 np0005531888 systemd-machined[153106]: Machine qemu-71-instance-00000094 terminated.
Nov 22 03:17:53 np0005531888 podman[238521]: 2025-11-22 08:17:53.181616957 +0000 UTC m=+0.053255081 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:17:53 np0005531888 podman[238520]: 2025-11-22 08:17:53.207230918 +0000 UTC m=+0.081852485 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.321 186792 INFO nova.virt.libvirt.driver [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance destroyed successfully.
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.322 186792 DEBUG nova.objects.instance [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lazy-loading 'resources' on Instance uuid 18bc318c-d706-498c-9736-fc71aaa6660c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.341 186792 INFO nova.virt.libvirt.driver [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Deleting instance files /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c_del
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.342 186792 INFO nova.virt.libvirt.driver [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Deletion of /var/lib/nova/instances/18bc318c-d706-498c-9736-fc71aaa6660c_del complete
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.428 186792 INFO nova.compute.manager [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.428 186792 DEBUG oslo.service.loopingcall [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.429 186792 DEBUG nova.compute.manager [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.429 186792 DEBUG nova.network.neutron [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.914 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.950 186792 DEBUG nova.network.neutron [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.963 186792 DEBUG nova.network.neutron [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:17:53 np0005531888 nova_compute[186788]: 2025-11-22 08:17:53.975 186792 INFO nova.compute.manager [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Took 0.55 seconds to deallocate network for instance.
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.097 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.098 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.209 186792 DEBUG nova.compute.provider_tree [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.226 186792 DEBUG nova.scheduler.client.report [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.267 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.344 186792 INFO nova.scheduler.client.report [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Deleted allocations for instance 18bc318c-d706-498c-9736-fc71aaa6660c
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.415 186792 DEBUG oslo_concurrency.lockutils [None req-5ca3f309-1a2f-45a4-be37-061280b3e1af 18cd88673a4c4a8d99c2d8402355658f f8f9152c32bc421ab7cc2d80a473c0d6 - - default default] Lock "18bc318c-d706-498c-9736-fc71aaa6660c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:17:56 np0005531888 nova_compute[186788]: 2025-11-22 08:17:56.506 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:17:58.507 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:17:58 np0005531888 nova_compute[186788]: 2025-11-22 08:17:58.507 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:17:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:17:58.508 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:17:58 np0005531888 nova_compute[186788]: 2025-11-22 08:17:58.917 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:01 np0005531888 nova_compute[186788]: 2025-11-22 08:18:01.508 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:01 np0005531888 nova_compute[186788]: 2025-11-22 08:18:01.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:03.511 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:18:03 np0005531888 nova_compute[186788]: 2025-11-22 08:18:03.919 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:04 np0005531888 podman[238574]: 2025-11-22 08:18:04.698899813 +0000 UTC m=+0.067353347 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:18:04 np0005531888 nova_compute[186788]: 2025-11-22 08:18:04.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:06 np0005531888 nova_compute[186788]: 2025-11-22 08:18:06.512 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:06 np0005531888 podman[238594]: 2025-11-22 08:18:06.676771466 +0000 UTC m=+0.052891262 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:18:07 np0005531888 nova_compute[186788]: 2025-11-22 08:18:07.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:07 np0005531888 nova_compute[186788]: 2025-11-22 08:18:07.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 03:18:07 np0005531888 nova_compute[186788]: 2025-11-22 08:18:07.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 03:18:07 np0005531888 nova_compute[186788]: 2025-11-22 08:18:07.968 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 03:18:07 np0005531888 nova_compute[186788]: 2025-11-22 08:18:07.969 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:08 np0005531888 nova_compute[186788]: 2025-11-22 08:18:08.320 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799473.3192747, 18bc318c-d706-498c-9736-fc71aaa6660c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:18:08 np0005531888 nova_compute[186788]: 2025-11-22 08:18:08.321 186792 INFO nova.compute.manager [-] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] VM Stopped (Lifecycle Event)
Nov 22 03:18:08 np0005531888 nova_compute[186788]: 2025-11-22 08:18:08.348 186792 DEBUG nova.compute.manager [None req-5545a93e-ed11-4bdb-a34c-fb95e95b286d - - - - - -] [instance: 18bc318c-d706-498c-9736-fc71aaa6660c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:18:08 np0005531888 nova_compute[186788]: 2025-11-22 08:18:08.920 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:09 np0005531888 nova_compute[186788]: 2025-11-22 08:18:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:10 np0005531888 podman[238616]: 2025-11-22 08:18:10.685861412 +0000 UTC m=+0.058916900 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Nov 22 03:18:11 np0005531888 nova_compute[186788]: 2025-11-22 08:18:11.515 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:12 np0005531888 nova_compute[186788]: 2025-11-22 08:18:12.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:13 np0005531888 podman[238639]: 2025-11-22 08:18:13.687510112 +0000 UTC m=+0.062776485 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 03:18:13 np0005531888 podman[238640]: 2025-11-22 08:18:13.703858984 +0000 UTC m=+0.075634631 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:18:13 np0005531888 nova_compute[186788]: 2025-11-22 08:18:13.922 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:14 np0005531888 nova_compute[186788]: 2025-11-22 08:18:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:18:14 np0005531888 nova_compute[186788]: 2025-11-22 08:18:14.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:18:14 np0005531888 nova_compute[186788]: 2025-11-22 08:18:14.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:18:14 np0005531888 nova_compute[186788]: 2025-11-22 08:18:14.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:18:14 np0005531888 nova_compute[186788]: 2025-11-22 08:18:14.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.133 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.134 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.27429580688477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.134 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.134 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.185 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.185 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.211 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.221 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.239 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:18:15 np0005531888 nova_compute[186788]: 2025-11-22 08:18:15.239 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:16 np0005531888 nova_compute[186788]: 2025-11-22 08:18:16.518 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:17 np0005531888 nova_compute[186788]: 2025-11-22 08:18:17.238 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:18 np0005531888 nova_compute[186788]: 2025-11-22 08:18:18.922 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:19 np0005531888 nova_compute[186788]: 2025-11-22 08:18:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:19 np0005531888 nova_compute[186788]: 2025-11-22 08:18:19.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:18:21 np0005531888 nova_compute[186788]: 2025-11-22 08:18:21.521 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:23 np0005531888 podman[238683]: 2025-11-22 08:18:23.695946413 +0000 UTC m=+0.066459085 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:18:23 np0005531888 podman[238684]: 2025-11-22 08:18:23.710543782 +0000 UTC m=+0.076773679 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:18:23 np0005531888 nova_compute[186788]: 2025-11-22 08:18:23.923 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.483 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.484 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.558 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.825 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.828 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.838 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.838 186792 INFO nova.compute.claims [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.976 186792 DEBUG nova.compute.provider_tree [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:18:25 np0005531888 nova_compute[186788]: 2025-11-22 08:18:25.990 186792 DEBUG nova.scheduler.client.report [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.027 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.028 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.151 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.151 186792 DEBUG nova.network.neutron [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.222 186792 INFO nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.296 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.454 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.456 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.456 186792 INFO nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Creating image(s)#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.457 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.457 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.458 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.474 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.525 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.530 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.531 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.532 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.559 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.628 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.629 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.705 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk 1073741824" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.706 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.707 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.778 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.779 186792 DEBUG nova.virt.disk.api [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.779 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.841 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.843 186792 DEBUG nova.virt.disk.api [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.844 186792 DEBUG nova.objects.instance [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.866 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.867 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Ensure instance console log exists: /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.868 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.868 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:26 np0005531888 nova_compute[186788]: 2025-11-22 08:18:26.868 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:27 np0005531888 nova_compute[186788]: 2025-11-22 08:18:27.031 186792 DEBUG nova.policy [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:18:28 np0005531888 nova_compute[186788]: 2025-11-22 08:18:28.925 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:29 np0005531888 nova_compute[186788]: 2025-11-22 08:18:29.102 186792 DEBUG nova.network.neutron [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Successfully created port: a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.320 186792 DEBUG nova.network.neutron [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Successfully updated port: a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.411 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.411 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.412 186792 DEBUG nova.network.neutron [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.486 186792 DEBUG nova.compute.manager [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.486 186792 DEBUG nova.compute.manager [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing instance network info cache due to event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.486 186792 DEBUG oslo_concurrency.lockutils [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:30 np0005531888 nova_compute[186788]: 2025-11-22 08:18:30.876 186792 DEBUG nova.network.neutron [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:18:31 np0005531888 nova_compute[186788]: 2025-11-22 08:18:31.529 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.001 186792 DEBUG nova.network.neutron [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.026 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.026 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance network_info: |[{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.027 186792 DEBUG oslo_concurrency.lockutils [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.027 186792 DEBUG nova.network.neutron [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.030 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Start _get_guest_xml network_info=[{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.036 186792 WARNING nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.042 186792 DEBUG nova.virt.libvirt.host [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.043 186792 DEBUG nova.virt.libvirt.host [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.049 186792 DEBUG nova.virt.libvirt.host [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.050 186792 DEBUG nova.virt.libvirt.host [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.051 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.051 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.052 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.052 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.053 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.053 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.054 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.054 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.054 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.055 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.055 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.055 186792 DEBUG nova.virt.hardware [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.059 186792 DEBUG nova.virt.libvirt.vif [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:18:26Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.060 186792 DEBUG nova.network.os_vif_util [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.061 186792 DEBUG nova.network.os_vif_util [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.062 186792 DEBUG nova.objects.instance [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.077 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <uuid>ff7656a5-6680-4acd-a89d-fdc5e9fb914a</uuid>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <name>instance-00000096</name>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-104914358</nova:name>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:18:32</nova:creationTime>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        <nova:port uuid="a6be1de1-c2dd-4be7-89df-bfa4d9bc296c">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <entry name="serial">ff7656a5-6680-4acd-a89d-fdc5e9fb914a</entry>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <entry name="uuid">ff7656a5-6680-4acd-a89d-fdc5e9fb914a</entry>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:e8:eb:ea"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <target dev="tapa6be1de1-c2"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/console.log" append="off"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:18:32 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:18:32 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:18:32 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:18:32 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.078 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Preparing to wait for external event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.078 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.079 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.079 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.080 186792 DEBUG nova.virt.libvirt.vif [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:18:26Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.080 186792 DEBUG nova.network.os_vif_util [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.081 186792 DEBUG nova.network.os_vif_util [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.081 186792 DEBUG os_vif [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.082 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.083 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.085 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.086 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6be1de1-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.086 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6be1de1-c2, col_values=(('external_ids', {'iface-id': 'a6be1de1-c2dd-4be7-89df-bfa4d9bc296c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:eb:ea', 'vm-uuid': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.087 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 NetworkManager[55166]: <info>  [1763799512.0892] manager: (tapa6be1de1-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.089 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.094 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.095 186792 INFO os_vif [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2')#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.144 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.144 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.144 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:e8:eb:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.145 186792 INFO nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Using config drive#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.553 186792 INFO nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Creating config drive at /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.558 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoprp028g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.684 186792 DEBUG oslo_concurrency.processutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoprp028g" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:32 np0005531888 kernel: tapa6be1de1-c2: entered promiscuous mode
Nov 22 03:18:32 np0005531888 NetworkManager[55166]: <info>  [1763799512.7638] manager: (tapa6be1de1-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/268)
Nov 22 03:18:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:32Z|00573|binding|INFO|Claiming lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for this chassis.
Nov 22 03:18:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:32Z|00574|binding|INFO|a6be1de1-c2dd-4be7-89df-bfa4d9bc296c: Claiming fa:16:3e:e8:eb:ea 10.100.0.3
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.766 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.782 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:eb:ea 10.100.0.3'], port_security=['fa:16:3e:e8:eb:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec72ffac-7400-49d0-9e0a-60c991449755', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '84df0425-e3b1-4ba9-b876-812d98417396', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=271b2087-b100-40c2-aba4-df256e37c26c, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.784 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c in datapath ec72ffac-7400-49d0-9e0a-60c991449755 bound to our chassis#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.786 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec72ffac-7400-49d0-9e0a-60c991449755#033[00m
Nov 22 03:18:32 np0005531888 systemd-udevd[238761]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.799 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[188002a8-2717-444d-86cd-558d6f2e9289]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.800 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec72ffac-71 in ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.801 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec72ffac-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.802 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4699bf45-3449-4761-9450-2b6d001c287e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.803 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f62a726c-d275-4c40-8447-76d1ed3e8e2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 NetworkManager[55166]: <info>  [1763799512.8096] device (tapa6be1de1-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:18:32 np0005531888 NetworkManager[55166]: <info>  [1763799512.8110] device (tapa6be1de1-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.815 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5037e2-5a14-4b28-884d-eebdb99ed978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 systemd-machined[153106]: New machine qemu-72-instance-00000096.
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.822 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:32Z|00575|binding|INFO|Setting lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c ovn-installed in OVS
Nov 22 03:18:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:32Z|00576|binding|INFO|Setting lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c up in Southbound
Nov 22 03:18:32 np0005531888 nova_compute[186788]: 2025-11-22 08:18:32.828 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:32 np0005531888 systemd[1]: Started Virtual Machine qemu-72-instance-00000096.
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.839 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[21f9a1dc-5197-4b0f-b386-9ea699794959]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.866 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[108e1ab4-5eb7-48fe-9baa-bf2cadf4dcf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.871 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c642f9-6dd9-4e2d-867e-a39fa7e37c4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 NetworkManager[55166]: <info>  [1763799512.8732] manager: (tapec72ffac-70): new Veth device (/org/freedesktop/NetworkManager/Devices/269)
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.898 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2357c604-04d2-42e6-ac38-a4aa37a3e7b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.901 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[0fefae3b-2f12-4168-b3ea-33895c9ab51e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 NetworkManager[55166]: <info>  [1763799512.9223] device (tapec72ffac-70): carrier: link connected
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.927 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a0528a-70c8-40df-8e08-06f1364f7b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.943 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[03b32f6a-140e-4d59-8bdc-57dba6578353]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec72ffac-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:5f:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625356, 'reachable_time': 27067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238795, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.959 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa3d6b9-0656-4ec7-8a96-1dc395f38205]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe31:5f6b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625356, 'tstamp': 625356}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238799, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:32.975 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1407e103-bb9c-42a6-bed0-4745cc8548d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec72ffac-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:5f:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625356, 'reachable_time': 27067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238802, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.002 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2b10b037-10a3-4a99-b211-60b9f29456aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.035 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799513.0346816, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.035 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Started (Lifecycle Event)#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.055 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.061 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799513.0357249, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.062 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.070 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e340b4-79c5-493d-a2fe-d21613f6ca31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.072 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec72ffac-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.100 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.105 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.102 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.106 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec72ffac-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:33 np0005531888 NetworkManager[55166]: <info>  [1763799513.1118] manager: (tapec72ffac-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Nov 22 03:18:33 np0005531888 kernel: tapec72ffac-70: entered promiscuous mode
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.114 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.116 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec72ffac-70, col_values=(('external_ids', {'iface-id': 'a04e532b-8aea-4e90-9617-f7d5299315eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.118 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:33 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:33Z|00577|binding|INFO|Releasing lport a04e532b-8aea-4e90-9617-f7d5299315eb from this chassis (sb_readonly=0)
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.118 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.119 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec72ffac-7400-49d0-9e0a-60c991449755.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec72ffac-7400-49d0-9e0a-60c991449755.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.120 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0f84166b-66c1-4adf-8655-3c620b2b6afd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.121 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-ec72ffac-7400-49d0-9e0a-60c991449755
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/ec72ffac-7400-49d0-9e0a-60c991449755.pid.haproxy
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID ec72ffac-7400-49d0-9e0a-60c991449755
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:18:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:33.123 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'env', 'PROCESS_TAG=haproxy-ec72ffac-7400-49d0-9e0a-60c991449755', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec72ffac-7400-49d0-9e0a-60c991449755.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.126 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.130 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.272 186792 DEBUG nova.compute.manager [req-03858718-539d-4475-95d6-13115e04156a req-a62656a0-f668-4f8a-b91c-ce284d1ec1aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.272 186792 DEBUG oslo_concurrency.lockutils [req-03858718-539d-4475-95d6-13115e04156a req-a62656a0-f668-4f8a-b91c-ce284d1ec1aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.272 186792 DEBUG oslo_concurrency.lockutils [req-03858718-539d-4475-95d6-13115e04156a req-a62656a0-f668-4f8a-b91c-ce284d1ec1aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.272 186792 DEBUG oslo_concurrency.lockutils [req-03858718-539d-4475-95d6-13115e04156a req-a62656a0-f668-4f8a-b91c-ce284d1ec1aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.273 186792 DEBUG nova.compute.manager [req-03858718-539d-4475-95d6-13115e04156a req-a62656a0-f668-4f8a-b91c-ce284d1ec1aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Processing event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.273 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.283 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799513.2830954, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.284 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.286 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.290 186792 INFO nova.virt.libvirt.driver [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance spawned successfully.#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.291 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.310 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.320 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.325 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.326 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.326 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.327 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.328 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.328 186792 DEBUG nova.virt.libvirt.driver [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.355 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.440 186792 INFO nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Took 6.99 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.441 186792 DEBUG nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.531 186792 INFO nova.compute.manager [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Took 7.75 seconds to build instance.#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.571 186792 DEBUG oslo_concurrency.lockutils [None req-8e46e46c-2003-4335-80c1-ac8a44f5d0da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:33 np0005531888 podman[238835]: 2025-11-22 08:18:33.534473064 +0000 UTC m=+0.025970880 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.738 186792 DEBUG nova.network.neutron [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updated VIF entry in instance network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.739 186792 DEBUG nova.network.neutron [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.762 186792 DEBUG oslo_concurrency.lockutils [req-91d20a54-a483-4abe-856c-5522150b2b34 req-4260fd18-3361-4ee1-96c3-82a235ce2231 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:33 np0005531888 podman[238835]: 2025-11-22 08:18:33.897592454 +0000 UTC m=+0.389090250 container create 68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.927 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:33 np0005531888 nova_compute[186788]: 2025-11-22 08:18:33.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:34 np0005531888 systemd[1]: Started libpod-conmon-68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6.scope.
Nov 22 03:18:34 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:18:34 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a359373dbd2806d7434f94ccb021abf6a60693fecb7b65661c8bacd70c7054c7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:18:34 np0005531888 podman[238835]: 2025-11-22 08:18:34.123228723 +0000 UTC m=+0.614726539 container init 68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:18:34 np0005531888 podman[238835]: 2025-11-22 08:18:34.128988646 +0000 UTC m=+0.620486442 container start 68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:18:34 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [NOTICE]   (238854) : New worker (238856) forked
Nov 22 03:18:34 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [NOTICE]   (238854) : Loading success.
Nov 22 03:18:35 np0005531888 nova_compute[186788]: 2025-11-22 08:18:35.499 186792 DEBUG nova.compute.manager [req-1321042d-fae1-41a9-a1dc-65048b8ff761 req-4378a42d-2116-4996-a260-394900a203f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:35 np0005531888 nova_compute[186788]: 2025-11-22 08:18:35.499 186792 DEBUG oslo_concurrency.lockutils [req-1321042d-fae1-41a9-a1dc-65048b8ff761 req-4378a42d-2116-4996-a260-394900a203f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:35 np0005531888 nova_compute[186788]: 2025-11-22 08:18:35.499 186792 DEBUG oslo_concurrency.lockutils [req-1321042d-fae1-41a9-a1dc-65048b8ff761 req-4378a42d-2116-4996-a260-394900a203f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:35 np0005531888 nova_compute[186788]: 2025-11-22 08:18:35.499 186792 DEBUG oslo_concurrency.lockutils [req-1321042d-fae1-41a9-a1dc-65048b8ff761 req-4378a42d-2116-4996-a260-394900a203f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:35 np0005531888 nova_compute[186788]: 2025-11-22 08:18:35.500 186792 DEBUG nova.compute.manager [req-1321042d-fae1-41a9-a1dc-65048b8ff761 req-4378a42d-2116-4996-a260-394900a203f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:35 np0005531888 nova_compute[186788]: 2025-11-22 08:18:35.500 186792 WARNING nova.compute.manager [req-1321042d-fae1-41a9-a1dc-65048b8ff761 req-4378a42d-2116-4996-a260-394900a203f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state active and task_state None.#033[00m
Nov 22 03:18:35 np0005531888 podman[238865]: 2025-11-22 08:18:35.710458519 +0000 UTC m=+0.087095723 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:18:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:36.833 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:18:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:36.834 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:18:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:36.834 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.848 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000096', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '042f6d127720471aaedb8a1fb7535416', 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'hostId': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.849 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.851 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ff7656a5-6680-4acd-a89d-fdc5e9fb914a / tapa6be1de1-c2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.852 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33e1ab93-d271-4958-ae31-81fe4ef91429', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.849296', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd8132fc8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': 'b1b68d2f547c8444e3381653007bad0948610a487108aac86e26e0edb8a5b79d'}]}, 'timestamp': '2025-11-22 08:18:36.852797', '_unique_id': 'a9e6399df88145b389d02de4098b97a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.853 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.855 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.855 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3698072d-bb2d-46dc-bb1b-3fc61fe05fd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.855128', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd8139814-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '6dfce59dcd43d81365c2dc95624d3c74fc27ce24323375542433bbe9ee5eb790'}]}, 'timestamp': '2025-11-22 08:18:36.855457', '_unique_id': '66f0e98b9489402fb09828243e23bd31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.856 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>]
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.857 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.898 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.899 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd951093e-3ca6-464e-bb77-5a760210516b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.857093', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd81a4b6e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '4c014a3dee925bc2e487ed894759bdd4d899f9fa551d04a57175e18c99a3147f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.857093', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd81a57c6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': 'd5bb1d065c0e6ce7c3bf0e96838ee0bbfe48722b3d471a662e43c6723c134e47'}]}, 'timestamp': '2025-11-22 08:18:36.899669', '_unique_id': '89d702011e004441a5839c29afbb5f38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.900 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.901 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '192ae069-3ae2-4c74-bbb3-f2e0881dcdf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.901771', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd81ab626-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '5e10b7d152cb995433a50196d01d6ea20d25bcd4e04fbe90c99a83f18f1087c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 
'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.901771', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd81ac076-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': 'b4f652a0b2e2051258609236027d6ac073d3e6064081dd2776ae3e9e5de9029a'}]}, 'timestamp': '2025-11-22 08:18:36.902308', '_unique_id': 'b84effafa78845ec86bbed513dc79dfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.902 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.903 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.918 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/cpu volume: 3520000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8abc186-6a65-41b7-81d3-d4fd7bdb1481', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3520000000, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'timestamp': '2025-11-22T08:18:36.903959', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd81d5da4-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.61832954, 'message_signature': 'c51d4c74e135f4b1200abd82cd3d631a6a0a9cf8f5466c7594e5063131624bb9'}]}, 'timestamp': '2025-11-22 08:18:36.919504', '_unique_id': 'ad80db33a41b4d839e964abf09a6aa03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.921 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8bfed60-ac08-4e57-a332-1574e2f6b20e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.921534', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd81dbbe6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '226922afd7801fb506b0858b0d7efc676158150a3954668af1f776251d0b2ab6'}]}, 'timestamp': '2025-11-22 08:18:36.921901', '_unique_id': '2f021157cb1f44e3a0ec501170deb80f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.922 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.923 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.read.latency volume: 1147669037 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.923 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.read.latency volume: 21037988 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cb35d53-7a5c-4f50-b980-fd4a339f9ea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1147669037, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.923489', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd81e06c8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '843e25c26d573952508a20ae181bd5bf6431d16f7f09a369631463111c8fc5d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21037988, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.923489', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd81e1046-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '6eac7f0da241cbd6471226d70086c365d2c20db84df062f82dca13c463b65fbb'}]}, 'timestamp': '2025-11-22 08:18:36.924009', '_unique_id': 'f08badabe8a84d0eb47c8809b5265e66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.925 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11fccf7a-cc5f-4a60-bbd9-1c39641e568a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.925490', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd81e55a6-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '4bf7eb86c5c93b41b1e240c65edbd7df49a686d2f699954c92998aa5ac414399'}]}, 'timestamp': '2025-11-22 08:18:36.925798', '_unique_id': '10a573348c4341e4a5d1852b83c0c8e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.927 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.927 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91fc19d0-6415-4418-9f5d-60180fb9f2ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.927248', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd81e9890-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '6d6c035ca94889de244b60f2e44bc3c0b1bab67aa8cb901d83437646b72d7344'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.927248', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd81ea2fe-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '90d00769cf5d71996540b4e4c9bb8afd794bf8da4f3b7ac72dc23a061111e60a'}]}, 'timestamp': '2025-11-22 08:18:36.927765', '_unique_id': 'c34ae256e12b45bda4840f4097d7fa86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.928 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.941 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.942 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15d34f83-f086-4a9a-943b-995ec8b8cd66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.929250', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd820cd90-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.628750037, 'message_signature': '6c63029446f1e0fb2143b617219ca46749ac9281134bc516d1e1486153ead563'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.929250', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd820dcae-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.628750037, 'message_signature': '9bf2be570d9819549451f97b69628fc747e26717979f4149a5633d102efe5cc4'}]}, 'timestamp': '2025-11-22 08:18:36.942382', '_unique_id': '5083b377f4c845c9856a00ff15f4a550'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.944 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.944 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>]
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.945 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.945 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.945 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f730d4af-b6e4-4ef4-88a7-0aa5e47e6a5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.945189', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd8215710-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.628750037, 'message_signature': '300aac5659d49679043f0d316efea0020531f744f6af46c5e46b0ec5c4de5dd1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 
'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.945189', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd821630e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.628750037, 'message_signature': 'b1270bba07a6cd7663692f5794ee4039890603811631dc5ec58ef98e37637c48'}]}, 'timestamp': '2025-11-22 08:18:36.945803', '_unique_id': '33f862b524014b7b897d4aab6046f17a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.947 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.947 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>]
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.948 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.948 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.948 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69abca67-a4ce-4f04-b057-e154b23a3178', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.948210', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd821cd26-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.628750037, 'message_signature': '99895af53f32f3eb5f4c2136e51f1dae3842b568d53b1f1baea3be22a6d2e320'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 
'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.948210', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd821d9ba-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.628750037, 'message_signature': 'a6e8ea788bc7d01b24b0291724128f2e501098602bb2259c64f47589d0c3d2fc'}]}, 'timestamp': '2025-11-22 08:18:36.948880', '_unique_id': '30316311837c45c7970ad90991b50112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.950 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26fcd7ca-e760-4ba3-83d4-1d8a23dcb89a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.950734', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd8222f32-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': 'cde3b60d436206e129c33060ecadc2945a79ac4828ad229eca518fbc0a376dfa'}]}, 'timestamp': '2025-11-22 08:18:36.951036', '_unique_id': '9c89d13d369a4a9b802b1ff7247410a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.952 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc66d44f-0eaa-413d-8d61-5e71bf8b86d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.952757', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd8227da2-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': 'b01d68b6c564540d0b200acc2bcd801e65798da1f6961d3dfb7adff79c79efa4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.952757', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd82289c8-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': 'deeee136a5655773ae0071bb75c04d2d4e27b3af12296e08266e659b5e120585'}]}, 'timestamp': '2025-11-22 08:18:36.953337', '_unique_id': 'e0225f1de79b44e486da3cda285e0489'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.954 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a00dc48-de87-43bb-831c-06164f943309', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.954984', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd822d4fa-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '11f5eb082b49cd6bf2a2c6b082d700680ad809d73cc8a74b0e5a563081546457'}]}, 'timestamp': '2025-11-22 08:18:36.955295', '_unique_id': 'ec1a8ae676ad4cc8873a5e01ffb63436'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.956 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c140c513-99a1-4ec9-947e-2e32a3ac85b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.956840', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd8231d8e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '7b5e29ee2f50ddbbdb591a737579fa43229ff0aec30c26b480fde8b1eaed2269'}]}, 'timestamp': '2025-11-22 08:18:36.957142', '_unique_id': '2582d024d59440a4b8740c8740bc92d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.957 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.958 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84337456-1a27-44d4-8244-8e20fef0972b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.958649', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd8236384-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '94392136292632c665c24667f84a0fb84612ce7c5ac72124452864e40fa13fdc'}]}, 'timestamp': '2025-11-22 08:18:36.958923', '_unique_id': 'a3461c7a79ad4494b7f03a89cf402031'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.960 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '838d84f2-55ba-4f62-854f-b1af14dfa619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.960596', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd823afce-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': '4512369881d56a192ee4dbc9a3b8646e38f23e17e3b0bc117f1ad111a49953a0'}]}, 'timestamp': '2025-11-22 08:18:36.960884', '_unique_id': 'e7af7e97d1194543a0bd9e4fbdc28673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.962 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b7fdd83-c0c5-4361-99ac-75eaba8a9b1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000096-ff7656a5-6680-4acd-a89d-fdc5e9fb914a-tapa6be1de1-c2', 'timestamp': '2025-11-22T08:18:36.962522', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'tapa6be1de1-c2', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e8:eb:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa6be1de1-c2'}, 'message_id': 'd823fc18-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.548782599, 'message_signature': 'db16147271a04572da227c9e378ba47267ba0cabf38739faace383134319bdef'}]}, 'timestamp': '2025-11-22 08:18:36.962831', '_unique_id': '5b878982b176487d9e63f1662c0876b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.964 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.964 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.964 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance ff7656a5-6680-4acd-a89d-fdc5e9fb914a: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.964 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.964 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-104914358>]
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.965 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.965 12 DEBUG ceilometer.compute.pollsters [-] ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fa88a1f-9c20-461b-a325-ce376ec93e51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-vda', 'timestamp': '2025-11-22T08:18:36.965153', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd824614e-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '782372f2099d8e8273deda4a00e971e64a2be8fe9a97c2894b7c6176846ff783'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a-sda', 'timestamp': '2025-11-22T08:18:36.965153', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-104914358', 'name': 'instance-00000096', 'instance_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'instance_type': 'm1.nano', 'host': '113fd448f7cebbc8a6fc61ef41881397c4a6a694dbe57b1387709825', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd8246aae-c77b-11f0-941d-fa163e6775e5', 'monotonic_time': 6257.556590371, 'message_signature': '37ee6a97ae7ee2405c268c817742f63688c4b7493871fb7dabba5ec9fb76e4a4'}]}, 'timestamp': '2025-11-22 08:18:36.965662', '_unique_id': 'c29a57c318fc4261a0cb052b51c7cf5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:18:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:37 np0005531888 nova_compute[186788]: 2025-11-22 08:18:37.089 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531888 nova_compute[186788]: 2025-11-22 08:18:37.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531888 NetworkManager[55166]: <info>  [1763799517.3030] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Nov 22 03:18:37 np0005531888 NetworkManager[55166]: <info>  [1763799517.3043] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Nov 22 03:18:37 np0005531888 nova_compute[186788]: 2025-11-22 08:18:37.387 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:37Z|00578|binding|INFO|Releasing lport a04e532b-8aea-4e90-9617-f7d5299315eb from this chassis (sb_readonly=0)
Nov 22 03:18:37 np0005531888 nova_compute[186788]: 2025-11-22 08:18:37.399 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531888 podman[238887]: 2025-11-22 08:18:37.671779974 +0000 UTC m=+0.049643932 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:18:38 np0005531888 nova_compute[186788]: 2025-11-22 08:18:38.929 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:39 np0005531888 nova_compute[186788]: 2025-11-22 08:18:39.542 186792 DEBUG nova.compute.manager [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:39 np0005531888 nova_compute[186788]: 2025-11-22 08:18:39.543 186792 DEBUG nova.compute.manager [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing instance network info cache due to event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:18:39 np0005531888 nova_compute[186788]: 2025-11-22 08:18:39.543 186792 DEBUG oslo_concurrency.lockutils [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:39 np0005531888 nova_compute[186788]: 2025-11-22 08:18:39.543 186792 DEBUG oslo_concurrency.lockutils [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:39 np0005531888 nova_compute[186788]: 2025-11-22 08:18:39.544 186792 DEBUG nova.network.neutron [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:18:41 np0005531888 nova_compute[186788]: 2025-11-22 08:18:41.057 186792 DEBUG nova.network.neutron [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updated VIF entry in instance network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:18:41 np0005531888 nova_compute[186788]: 2025-11-22 08:18:41.059 186792 DEBUG nova.network.neutron [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:41 np0005531888 nova_compute[186788]: 2025-11-22 08:18:41.081 186792 DEBUG oslo_concurrency.lockutils [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:41 np0005531888 podman[238909]: 2025-11-22 08:18:41.689971634 +0000 UTC m=+0.062142930 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Nov 22 03:18:42 np0005531888 nova_compute[186788]: 2025-11-22 08:18:42.096 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:43 np0005531888 nova_compute[186788]: 2025-11-22 08:18:43.930 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:44 np0005531888 podman[238931]: 2025-11-22 08:18:44.681658369 +0000 UTC m=+0.059115265 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:18:44 np0005531888 podman[238932]: 2025-11-22 08:18:44.710758844 +0000 UTC m=+0.079129146 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:18:47 np0005531888 nova_compute[186788]: 2025-11-22 08:18:47.099 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:47Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:eb:ea 10.100.0.3
Nov 22 03:18:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:47Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:eb:ea 10.100.0.3
Nov 22 03:18:48 np0005531888 nova_compute[186788]: 2025-11-22 08:18:48.933 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:52 np0005531888 nova_compute[186788]: 2025-11-22 08:18:52.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:53 np0005531888 nova_compute[186788]: 2025-11-22 08:18:53.878 186792 INFO nova.compute.manager [None req-e1e336be-20a0-4777-b541-5c29f59aada4 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Get console output#033[00m
Nov 22 03:18:53 np0005531888 nova_compute[186788]: 2025-11-22 08:18:53.884 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:18:53 np0005531888 nova_compute[186788]: 2025-11-22 08:18:53.935 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:54 np0005531888 podman[238997]: 2025-11-22 08:18:54.919305536 +0000 UTC m=+0.046169246 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:18:54 np0005531888 podman[238996]: 2025-11-22 08:18:54.929700153 +0000 UTC m=+0.058731426 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:18:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:18:55Z|00579|binding|INFO|Releasing lport a04e532b-8aea-4e90-9617-f7d5299315eb from this chassis (sb_readonly=0)
Nov 22 03:18:55 np0005531888 nova_compute[186788]: 2025-11-22 08:18:55.136 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:55 np0005531888 nova_compute[186788]: 2025-11-22 08:18:55.901 186792 INFO nova.compute.manager [None req-b702c61c-d82e-4a8d-bfd2-d782bf6138c9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Get console output#033[00m
Nov 22 03:18:55 np0005531888 nova_compute[186788]: 2025-11-22 08:18:55.907 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:18:57 np0005531888 nova_compute[186788]: 2025-11-22 08:18:57.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:58 np0005531888 nova_compute[186788]: 2025-11-22 08:18:58.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:59 np0005531888 nova_compute[186788]: 2025-11-22 08:18:59.088 186792 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:59 np0005531888 nova_compute[186788]: 2025-11-22 08:18:59.089 186792 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:59 np0005531888 nova_compute[186788]: 2025-11-22 08:18:59.089 186792 DEBUG nova.network.neutron [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:18:59 np0005531888 nova_compute[186788]: 2025-11-22 08:18:59.498 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:59.499 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:18:59.502 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:19:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:00.505 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:00 np0005531888 nova_compute[186788]: 2025-11-22 08:19:00.525 186792 DEBUG nova.network.neutron [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:00 np0005531888 nova_compute[186788]: 2025-11-22 08:19:00.542 186792 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:00 np0005531888 nova_compute[186788]: 2025-11-22 08:19:00.640 186792 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 03:19:00 np0005531888 nova_compute[186788]: 2025-11-22 08:19:00.640 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Creating file /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/9ecbc19b4c9b473c94a7209fe97fdb45.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 03:19:00 np0005531888 nova_compute[186788]: 2025-11-22 08:19:00.640 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/9ecbc19b4c9b473c94a7209fe97fdb45.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:01 np0005531888 nova_compute[186788]: 2025-11-22 08:19:01.078 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/9ecbc19b4c9b473c94a7209fe97fdb45.tmp" returned: 1 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:01 np0005531888 nova_compute[186788]: 2025-11-22 08:19:01.079 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/9ecbc19b4c9b473c94a7209fe97fdb45.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 03:19:01 np0005531888 nova_compute[186788]: 2025-11-22 08:19:01.079 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Creating directory /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 03:19:01 np0005531888 nova_compute[186788]: 2025-11-22 08:19:01.080 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:01 np0005531888 nova_compute[186788]: 2025-11-22 08:19:01.294 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:01 np0005531888 nova_compute[186788]: 2025-11-22 08:19:01.299 186792 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:19:02 np0005531888 nova_compute[186788]: 2025-11-22 08:19:02.111 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:02 np0005531888 nova_compute[186788]: 2025-11-22 08:19:02.644 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:02 np0005531888 nova_compute[186788]: 2025-11-22 08:19:02.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:03 np0005531888 kernel: tapa6be1de1-c2 (unregistering): left promiscuous mode
Nov 22 03:19:03 np0005531888 NetworkManager[55166]: <info>  [1763799543.6620] device (tapa6be1de1-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:19:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:03Z|00580|binding|INFO|Releasing lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c from this chassis (sb_readonly=0)
Nov 22 03:19:03 np0005531888 nova_compute[186788]: 2025-11-22 08:19:03.671 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:03Z|00581|binding|INFO|Setting lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c down in Southbound
Nov 22 03:19:03 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:03Z|00582|binding|INFO|Removing iface tapa6be1de1-c2 ovn-installed in OVS
Nov 22 03:19:03 np0005531888 nova_compute[186788]: 2025-11-22 08:19:03.674 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.679 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:eb:ea 10.100.0.3'], port_security=['fa:16:3e:e8:eb:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec72ffac-7400-49d0-9e0a-60c991449755', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '84df0425-e3b1-4ba9-b876-812d98417396', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=271b2087-b100-40c2-aba4-df256e37c26c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.682 104023 INFO neutron.agent.ovn.metadata.agent [-] Port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c in datapath ec72ffac-7400-49d0-9e0a-60c991449755 unbound from our chassis#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.683 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec72ffac-7400-49d0-9e0a-60c991449755, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.685 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[562471fe-aaad-423c-86db-036e1144ea3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.686 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 namespace which is not needed anymore#033[00m
Nov 22 03:19:03 np0005531888 nova_compute[186788]: 2025-11-22 08:19:03.688 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:03 np0005531888 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000096.scope: Deactivated successfully.
Nov 22 03:19:03 np0005531888 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000096.scope: Consumed 15.102s CPU time.
Nov 22 03:19:03 np0005531888 systemd-machined[153106]: Machine qemu-72-instance-00000096 terminated.
Nov 22 03:19:03 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [NOTICE]   (238854) : haproxy version is 2.8.14-c23fe91
Nov 22 03:19:03 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [NOTICE]   (238854) : path to executable is /usr/sbin/haproxy
Nov 22 03:19:03 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [WARNING]  (238854) : Exiting Master process...
Nov 22 03:19:03 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [ALERT]    (238854) : Current worker (238856) exited with code 143 (Terminated)
Nov 22 03:19:03 np0005531888 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[238850]: [WARNING]  (238854) : All workers exited. Exiting... (0)
Nov 22 03:19:03 np0005531888 systemd[1]: libpod-68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6.scope: Deactivated successfully.
Nov 22 03:19:03 np0005531888 podman[239066]: 2025-11-22 08:19:03.835787743 +0000 UTC m=+0.065064592 container died 68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:19:03 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6-userdata-shm.mount: Deactivated successfully.
Nov 22 03:19:03 np0005531888 systemd[1]: var-lib-containers-storage-overlay-a359373dbd2806d7434f94ccb021abf6a60693fecb7b65661c8bacd70c7054c7-merged.mount: Deactivated successfully.
Nov 22 03:19:03 np0005531888 podman[239066]: 2025-11-22 08:19:03.88243596 +0000 UTC m=+0.111712819 container cleanup 68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:19:03 np0005531888 systemd[1]: libpod-conmon-68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6.scope: Deactivated successfully.
Nov 22 03:19:03 np0005531888 nova_compute[186788]: 2025-11-22 08:19:03.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:03 np0005531888 podman[239095]: 2025-11-22 08:19:03.978212615 +0000 UTC m=+0.073945730 container remove 68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.984 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5b0f01-6ed7-4b76-89c3-8aeef82a46c2]: (4, ('Sat Nov 22 08:19:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 (68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6)\n68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6\nSat Nov 22 08:19:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 (68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6)\n68614ccf9c9458eb3de81ed308995c4ff1d1b10b19772bc48b9dc59098ef69e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.986 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7684afe4-9b92-4b42-b04b-5808eaa5f54d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:03 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:03.986 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec72ffac-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:03 np0005531888 nova_compute[186788]: 2025-11-22 08:19:03.988 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:03 np0005531888 kernel: tapec72ffac-70: left promiscuous mode
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.004 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:04.007 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4f784a-4a06-447d-ad99-35713a4ada80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:04.034 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be87a2ec-2d2d-4a07-a5ce-bc2b6e39ec53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:04.035 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[675d5547-354a-4b9e-976c-509daf4e768b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:04.049 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[85c348b7-1091-4dea-9f87-8be916c87337]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625350, 'reachable_time': 31047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239130, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:04.051 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:19:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:04.051 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[044b9e6b-642b-4cd6-96cd-14a6a6663cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:04 np0005531888 systemd[1]: run-netns-ovnmeta\x2dec72ffac\x2d7400\x2d49d0\x2d9e0a\x2d60c991449755.mount: Deactivated successfully.
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.143 186792 DEBUG nova.compute.manager [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.143 186792 DEBUG oslo_concurrency.lockutils [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.143 186792 DEBUG oslo_concurrency.lockutils [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.144 186792 DEBUG oslo_concurrency.lockutils [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.144 186792 DEBUG nova.compute.manager [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.144 186792 WARNING nova.compute.manager [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.316 186792 INFO nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.324 186792 INFO nova.virt.libvirt.driver [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance destroyed successfully.#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.325 186792 DEBUG nova.virt.libvirt.vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:18:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:18:58Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.326 186792 DEBUG nova.network.os_vif_util [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.326 186792 DEBUG nova.network.os_vif_util [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.327 186792 DEBUG os_vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.329 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.329 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6be1de1-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.331 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.337 186792 INFO os_vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2')#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.343 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.420 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.421 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.487 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.488 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk to 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.488 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:04 np0005531888 nova_compute[186788]: 2025-11-22 08:19:04.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.183 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -r /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk" returned: 0 in 0.695s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.185 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.185 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk.config 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.425 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -C -r /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk.config 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.426 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.426 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk.info 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.641 186792 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -C -r /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_resize/disk.info 192.168.122.101:/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.768 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:05 np0005531888 nova_compute[186788]: 2025-11-22 08:19:05.998 186792 DEBUG neutronclient.v2_0.client [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.181 186792 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.182 186792 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.182 186792 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.272 186792 DEBUG nova.compute.manager [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.273 186792 DEBUG oslo_concurrency.lockutils [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.273 186792 DEBUG oslo_concurrency.lockutils [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.273 186792 DEBUG oslo_concurrency.lockutils [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.274 186792 DEBUG nova.compute.manager [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:06 np0005531888 nova_compute[186788]: 2025-11-22 08:19:06.274 186792 WARNING nova.compute.manager [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:19:06 np0005531888 podman[239144]: 2025-11-22 08:19:06.712391937 +0000 UTC m=+0.078773388 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 22 03:19:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:07.550 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8::f816:3eff:fe2c:5b22'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac76a812-5ead-4b51-8c63-4eaca1b65820) old=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:07.552 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac76a812-5ead-4b51-8c63-4eaca1b65820 in datapath cfb1249f-37ac-4df7-b559-e7968406997d updated#033[00m
Nov 22 03:19:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:07.555 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:19:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:07.556 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[15703999-2331-431e-aa7e-40bed53ff1f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:07 np0005531888 nova_compute[186788]: 2025-11-22 08:19:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:07 np0005531888 nova_compute[186788]: 2025-11-22 08:19:07.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:19:07 np0005531888 nova_compute[186788]: 2025-11-22 08:19:07.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:19:07 np0005531888 nova_compute[186788]: 2025-11-22 08:19:07.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.412 186792 DEBUG nova.compute.manager [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.412 186792 DEBUG nova.compute.manager [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing instance network info cache due to event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.413 186792 DEBUG oslo_concurrency.lockutils [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.413 186792 DEBUG oslo_concurrency.lockutils [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.413 186792 DEBUG nova.network.neutron [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:19:08 np0005531888 podman[239164]: 2025-11-22 08:19:08.681333629 +0000 UTC m=+0.052031360 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.941 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:08 np0005531888 nova_compute[186788]: 2025-11-22 08:19:08.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:09 np0005531888 nova_compute[186788]: 2025-11-22 08:19:09.331 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531888 nova_compute[186788]: 2025-11-22 08:19:09.826 186792 DEBUG nova.network.neutron [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updated VIF entry in instance network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:19:09 np0005531888 nova_compute[186788]: 2025-11-22 08:19:09.826 186792 DEBUG nova.network.neutron [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:09 np0005531888 nova_compute[186788]: 2025-11-22 08:19:09.875 186792 DEBUG oslo_concurrency.lockutils [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:09 np0005531888 nova_compute[186788]: 2025-11-22 08:19:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:10 np0005531888 nova_compute[186788]: 2025-11-22 08:19:10.816 186792 DEBUG nova.compute.manager [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:10 np0005531888 nova_compute[186788]: 2025-11-22 08:19:10.816 186792 DEBUG oslo_concurrency.lockutils [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:10 np0005531888 nova_compute[186788]: 2025-11-22 08:19:10.816 186792 DEBUG oslo_concurrency.lockutils [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:10 np0005531888 nova_compute[186788]: 2025-11-22 08:19:10.817 186792 DEBUG oslo_concurrency.lockutils [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:10 np0005531888 nova_compute[186788]: 2025-11-22 08:19:10.817 186792 DEBUG nova.compute.manager [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:10 np0005531888 nova_compute[186788]: 2025-11-22 08:19:10.817 186792 WARNING nova.compute.manager [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:19:12 np0005531888 podman[239188]: 2025-11-22 08:19:12.724046453 +0000 UTC m=+0.098577086 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:19:12 np0005531888 nova_compute[186788]: 2025-11-22 08:19:12.942 186792 DEBUG nova.compute.manager [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:12 np0005531888 nova_compute[186788]: 2025-11-22 08:19:12.943 186792 DEBUG oslo_concurrency.lockutils [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:12 np0005531888 nova_compute[186788]: 2025-11-22 08:19:12.943 186792 DEBUG oslo_concurrency.lockutils [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:12 np0005531888 nova_compute[186788]: 2025-11-22 08:19:12.944 186792 DEBUG oslo_concurrency.lockutils [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:12 np0005531888 nova_compute[186788]: 2025-11-22 08:19:12.944 186792 DEBUG nova.compute.manager [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:12 np0005531888 nova_compute[186788]: 2025-11-22 08:19:12.945 186792 WARNING nova.compute.manager [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:19:13 np0005531888 nova_compute[186788]: 2025-11-22 08:19:13.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:13 np0005531888 nova_compute[186788]: 2025-11-22 08:19:13.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:14 np0005531888 nova_compute[186788]: 2025-11-22 08:19:14.333 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:14 np0005531888 nova_compute[186788]: 2025-11-22 08:19:14.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:14 np0005531888 nova_compute[186788]: 2025-11-22 08:19:14.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:14 np0005531888 nova_compute[186788]: 2025-11-22 08:19:14.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:14 np0005531888 nova_compute[186788]: 2025-11-22 08:19:14.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:14 np0005531888 nova_compute[186788]: 2025-11-22 08:19:14.989 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.076 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000096, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk#033[00m
Nov 22 03:19:15 np0005531888 podman[239209]: 2025-11-22 08:19:15.103598313 +0000 UTC m=+0.073962310 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:19:15 np0005531888 podman[239211]: 2025-11-22 08:19:15.121987795 +0000 UTC m=+0.088841935 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.243 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.244 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.244 186792 DEBUG nova.compute.manager [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Going to confirm migration 17 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.282 186792 DEBUG nova.objects.instance [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'info_cache' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.319 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.320 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5698MB free_disk=73.23789596557617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.320 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.320 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.362 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration for instance ff7656a5-6680-4acd-a89d-fdc5e9fb914a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.396 186792 INFO nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating resource usage from migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.397 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Starting to track outgoing migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f with flavor 31612188-3cd6-428b-9166-9568f0affd4a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.607 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.608 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.608 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.756 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.782 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.836 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:19:15 np0005531888 nova_compute[186788]: 2025-11-22 08:19:15.837 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:16.048 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8:0:1:f816:3eff:fe2c:5b22 2001:db8::f816:3eff:fe2c:5b22'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2c:5b22/64 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac76a812-5ead-4b51-8c63-4eaca1b65820) old=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8::f816:3eff:fe2c:5b22'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:16.049 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac76a812-5ead-4b51-8c63-4eaca1b65820 in datapath cfb1249f-37ac-4df7-b559-e7968406997d updated#033[00m
Nov 22 03:19:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:16.050 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:19:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:16.051 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[09895ffe-2733-47cc-aa10-a5591c2726a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:16 np0005531888 nova_compute[186788]: 2025-11-22 08:19:16.150 186792 DEBUG neutronclient.v2_0.client [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:19:16 np0005531888 nova_compute[186788]: 2025-11-22 08:19:16.151 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:16 np0005531888 nova_compute[186788]: 2025-11-22 08:19:16.152 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:16 np0005531888 nova_compute[186788]: 2025-11-22 08:19:16.152 186792 DEBUG nova.network.neutron [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.951 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799543.9504375, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.951 186792 INFO nova.compute.manager [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.984 186792 DEBUG nova.compute.manager [None req-15557c0c-076f-407e-b19e-e972e67e47d1 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:18 np0005531888 nova_compute[186788]: 2025-11-22 08:19:18.988 186792 DEBUG nova.compute.manager [None req-15557c0c-076f-407e-b19e-e972e67e47d1 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.018 186792 INFO nova.compute.manager [None req-15557c0c-076f-407e-b19e-e972e67e47d1 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.048 186792 DEBUG nova.network.neutron [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.066 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.066 186792 DEBUG nova.objects.instance [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.093 186792 DEBUG nova.virt.libvirt.vif [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:19:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:19:10Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.094 186792 DEBUG nova.network.os_vif_util [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.095 186792 DEBUG nova.network.os_vif_util [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.096 186792 DEBUG os_vif [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.097 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.098 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6be1de1-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.098 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.100 186792 INFO os_vif [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2')#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.100 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.101 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.198 186792 DEBUG nova.compute.provider_tree [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.212 186792 DEBUG nova.scheduler.client.report [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.341 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.769 186792 INFO nova.scheduler.client.report [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocation for migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.981 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:19 np0005531888 nova_compute[186788]: 2025-11-22 08:19:19.981 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:19:21 np0005531888 nova_compute[186788]: 2025-11-22 08:19:21.271 186792 DEBUG oslo_concurrency.lockutils [None req-e336a594-fea5-4a82-8b0d-df8522c31990 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:22 np0005531888 nova_compute[186788]: 2025-11-22 08:19:22.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:23 np0005531888 nova_compute[186788]: 2025-11-22 08:19:23.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:23 np0005531888 nova_compute[186788]: 2025-11-22 08:19:23.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:23 np0005531888 nova_compute[186788]: 2025-11-22 08:19:23.972 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:19:23 np0005531888 nova_compute[186788]: 2025-11-22 08:19:23.992 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:19:24 np0005531888 nova_compute[186788]: 2025-11-22 08:19:24.337 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:25 np0005531888 podman[239254]: 2025-11-22 08:19:25.678639647 +0000 UTC m=+0.052123563 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:19:25 np0005531888 podman[239255]: 2025-11-22 08:19:25.684722426 +0000 UTC m=+0.051436785 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, 
tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.460 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.461 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.475 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.591 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.592 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.601 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.601 186792 INFO nova.compute.claims [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.742 186792 DEBUG nova.compute.provider_tree [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.761 186792 DEBUG nova.scheduler.client.report [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.800 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.801 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.871 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.872 186792 DEBUG nova.network.neutron [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.897 186792 INFO nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:19:27 np0005531888 nova_compute[186788]: 2025-11-22 08:19:27.915 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.041 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.043 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.044 186792 INFO nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Creating image(s)#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.045 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.046 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.047 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.066 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.125 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.126 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.127 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.144 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.233 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.235 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.278 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.280 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.280 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.358 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.359 186792 DEBUG nova.virt.disk.api [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.360 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.416 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.418 186792 DEBUG nova.virt.disk.api [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.418 186792 DEBUG nova.objects.instance [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 623aa62d-3837-4517-b0c9-c705520c154b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.436 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.437 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Ensure instance console log exists: /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.437 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.438 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.438 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:28 np0005531888 nova_compute[186788]: 2025-11-22 08:19:28.949 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:29 np0005531888 nova_compute[186788]: 2025-11-22 08:19:29.034 186792 DEBUG nova.policy [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:19:29 np0005531888 nova_compute[186788]: 2025-11-22 08:19:29.339 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:32 np0005531888 nova_compute[186788]: 2025-11-22 08:19:32.167 186792 DEBUG nova.network.neutron [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Successfully created port: 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:19:33 np0005531888 nova_compute[186788]: 2025-11-22 08:19:33.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:34 np0005531888 nova_compute[186788]: 2025-11-22 08:19:34.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.043 186792 DEBUG nova.network.neutron [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Successfully updated port: 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.059 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.059 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.060 186792 DEBUG nova.network.neutron [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.199 186792 DEBUG nova.compute.manager [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-changed-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.199 186792 DEBUG nova.compute.manager [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Refreshing instance network info cache due to event network-changed-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.200 186792 DEBUG oslo_concurrency.lockutils [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:35 np0005531888 nova_compute[186788]: 2025-11-22 08:19:35.245 186792 DEBUG nova.network.neutron [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:19:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:36.834 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:36.835 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:36.835 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:37 np0005531888 podman[239312]: 2025-11-22 08:19:37.672083345 +0000 UTC m=+0.049849557 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.673 186792 DEBUG nova.network.neutron [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updating instance_info_cache with network_info: [{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.747 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.747 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Instance network_info: |[{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.748 186792 DEBUG oslo_concurrency.lockutils [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.748 186792 DEBUG nova.network.neutron [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Refreshing network info cache for port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.752 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Start _get_guest_xml network_info=[{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.756 186792 WARNING nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.762 186792 DEBUG nova.virt.libvirt.host [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.763 186792 DEBUG nova.virt.libvirt.host [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.766 186792 DEBUG nova.virt.libvirt.host [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.767 186792 DEBUG nova.virt.libvirt.host [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.768 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.768 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.768 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.768 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.769 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.769 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.769 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.770 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.770 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.770 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.770 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.771 186792 DEBUG nova.virt.hardware [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.774 186792 DEBUG nova.virt.libvirt.vif [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:19:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-854443914',display_name='tempest-TestGettingAddress-server-854443914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-854443914',id=151,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFe+Iivl03JqMU254FLdbdmYsFU6DbbfEEAr4K/FY8GDuQ0mBhcnts9hxMb1kzVXY50lm7S9mYwpnOmQECnf5XsVq5CeQ5VY2CUHiqO5dq+d/xeUGYNH940WARTUGprt0Q==',key_name='tempest-TestGettingAddress-223844638',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-u012fm3c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:19:27Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=623aa62d-3837-4517-b0c9-c705520c154b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", 
"dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.775 186792 DEBUG nova.network.os_vif_util [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.776 186792 DEBUG nova.network.os_vif_util [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.778 186792 DEBUG nova.objects.instance [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 623aa62d-3837-4517-b0c9-c705520c154b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.789 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <uuid>623aa62d-3837-4517-b0c9-c705520c154b</uuid>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <name>instance-00000097</name>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestGettingAddress-server-854443914</nova:name>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:19:38</nova:creationTime>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        <nova:port uuid="2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe1c:20c3" ipVersion="6"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe1c:20c3" ipVersion="6"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <entry name="serial">623aa62d-3837-4517-b0c9-c705520c154b</entry>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <entry name="uuid">623aa62d-3837-4517-b0c9-c705520c154b</entry>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.config"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:1c:20:c3"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <target dev="tap2b3fde18-35"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/console.log" append="off"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:19:38 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:19:38 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:19:38 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:19:38 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.790 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Preparing to wait for external event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.790 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.790 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.791 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.791 186792 DEBUG nova.virt.libvirt.vif [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:19:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-854443914',display_name='tempest-TestGettingAddress-server-854443914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-854443914',id=151,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFe+Iivl03JqMU254FLdbdmYsFU6DbbfEEAr4K/FY8GDuQ0mBhcnts9hxMb1kzVXY50lm7S9mYwpnOmQECnf5XsVq5CeQ5VY2CUHiqO5dq+d/xeUGYNH940WARTUGprt0Q==',key_name='tempest-TestGettingAddress-223844638',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-u012fm3c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:19:27Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=623aa62d-3837-4517-b0c9-c705520c154b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.791 186792 DEBUG nova.network.os_vif_util [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.792 186792 DEBUG nova.network.os_vif_util [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.793 186792 DEBUG os_vif [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.793 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.794 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.794 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.798 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.799 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b3fde18-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.799 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2b3fde18-35, col_values=(('external_ids', {'iface-id': '2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:20:c3', 'vm-uuid': '623aa62d-3837-4517-b0c9-c705520c154b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.800 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:38 np0005531888 NetworkManager[55166]: <info>  [1763799578.8027] manager: (tap2b3fde18-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.807 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.808 186792 INFO os_vif [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35')#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.853 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.853 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.853 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:1c:20:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.854 186792 INFO nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Using config drive#033[00m
Nov 22 03:19:38 np0005531888 nova_compute[186788]: 2025-11-22 08:19:38.953 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.177 186792 INFO nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Creating config drive at /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.config#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.184 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4p58h6h2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.310 186792 DEBUG oslo_concurrency.processutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4p58h6h2" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:39 np0005531888 kernel: tap2b3fde18-35: entered promiscuous mode
Nov 22 03:19:39 np0005531888 NetworkManager[55166]: <info>  [1763799579.3801] manager: (tap2b3fde18-35): new Tun device (/org/freedesktop/NetworkManager/Devices/274)
Nov 22 03:19:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:39Z|00583|binding|INFO|Claiming lport 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 for this chassis.
Nov 22 03:19:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:39Z|00584|binding|INFO|2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2: Claiming fa:16:3e:1c:20:c3 10.100.0.12 2001:db8:0:1:f816:3eff:fe1c:20c3 2001:db8::f816:3eff:fe1c:20c3
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.381 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.390 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:20:c3 10.100.0.12 2001:db8:0:1:f816:3eff:fe1c:20c3 2001:db8::f816:3eff:fe1c:20c3'], port_security=['fa:16:3e:1c:20:c3 10.100.0.12 2001:db8:0:1:f816:3eff:fe1c:20c3 2001:db8::f816:3eff:fe1c:20c3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8:0:1:f816:3eff:fe1c:20c3/64 2001:db8::f816:3eff:fe1c:20c3/64', 'neutron:device_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '04c6695d-d046-49fb-a069-528067303a16', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.391 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 in datapath cfb1249f-37ac-4df7-b559-e7968406997d bound to our chassis#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.392 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfb1249f-37ac-4df7-b559-e7968406997d#033[00m
Nov 22 03:19:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:39Z|00585|binding|INFO|Setting lport 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 ovn-installed in OVS
Nov 22 03:19:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:39Z|00586|binding|INFO|Setting lport 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 up in Southbound
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.398 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.400 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.404 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb36513-3e72-4828-aab9-fc0cda5f708e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.405 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcfb1249f-31 in ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.407 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcfb1249f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.407 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3acaf680-c656-4b67-842b-507e110e3e1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.412 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[60e16667-cb57-4ad8-a7ee-9a96c610554f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.422 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[37c76b22-6da0-4ca6-993c-70c13a4c4a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 systemd-udevd[239373]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:19:39 np0005531888 systemd-machined[153106]: New machine qemu-73-instance-00000097.
Nov 22 03:19:39 np0005531888 podman[239342]: 2025-11-22 08:19:39.436477227 +0000 UTC m=+0.059474703 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:19:39 np0005531888 NetworkManager[55166]: <info>  [1763799579.4386] device (tap2b3fde18-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:19:39 np0005531888 systemd[1]: Started Virtual Machine qemu-73-instance-00000097.
Nov 22 03:19:39 np0005531888 NetworkManager[55166]: <info>  [1763799579.4393] device (tap2b3fde18-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.445 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0a99afc5-c73e-4764-a96b-a2975e1dda4b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.474 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[32e38a5a-8b8a-4c14-914e-cc52a523a2a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 systemd-udevd[239376]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:19:39 np0005531888 NetworkManager[55166]: <info>  [1763799579.4799] manager: (tapcfb1249f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/275)
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.479 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb0a6a6-ff69-436f-b247-d996ebd63eb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.505 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4632f763-dc9a-482d-bda4-85f343e66e35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.508 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5a58624a-5425-40a9-92ed-6e37d0b62f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 NetworkManager[55166]: <info>  [1763799579.5266] device (tapcfb1249f-30): carrier: link connected
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.532 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[67e93a25-4e51-46d8-be7c-8d02bbecd7c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.549 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[21e75759-99f1-4bbc-a47d-6e49b5a79b29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfb1249f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:5b:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632017, 'reachable_time': 29573, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239406, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.564 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[311ee81e-b143-4029-ac8f-5dd8e2f6d01d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:5b22'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 632017, 'tstamp': 632017}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239407, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.587 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1d02f40b-92b2-422e-a8a8-a92a501461ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfb1249f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:5b:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632017, 'reachable_time': 29573, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239408, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.615 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[234ec952-b70c-446f-849a-6eb88fa9e581]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.672 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d15a35d7-ae96-4f1c-a8bb-43f4b9eb8c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.673 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfb1249f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.673 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.674 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfb1249f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:39 np0005531888 kernel: tapcfb1249f-30: entered promiscuous mode
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.676 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:39 np0005531888 NetworkManager[55166]: <info>  [1763799579.6780] manager: (tapcfb1249f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.679 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfb1249f-30, col_values=(('external_ids', {'iface-id': 'ac76a812-5ead-4b51-8c63-4eaca1b65820'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:39Z|00587|binding|INFO|Releasing lport ac76a812-5ead-4b51-8c63-4eaca1b65820 from this chassis (sb_readonly=0)
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.681 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cfb1249f-37ac-4df7-b559-e7968406997d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cfb1249f-37ac-4df7-b559-e7968406997d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.682 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9233b103-4c01-4ebd-bca1-4086ec5211ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.684 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-cfb1249f-37ac-4df7-b559-e7968406997d
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/cfb1249f-37ac-4df7-b559-e7968406997d.pid.haproxy
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID cfb1249f-37ac-4df7-b559-e7968406997d
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:19:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:19:39.684 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'env', 'PROCESS_TAG=haproxy-cfb1249f-37ac-4df7-b559-e7968406997d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cfb1249f-37ac-4df7-b559-e7968406997d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.693 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.871 186792 DEBUG nova.compute.manager [req-30c2ee54-4cb2-4141-99bb-4f401a95a660 req-04b31187-bad0-420a-a690-84ab02ffa1c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.872 186792 DEBUG oslo_concurrency.lockutils [req-30c2ee54-4cb2-4141-99bb-4f401a95a660 req-04b31187-bad0-420a-a690-84ab02ffa1c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.873 186792 DEBUG oslo_concurrency.lockutils [req-30c2ee54-4cb2-4141-99bb-4f401a95a660 req-04b31187-bad0-420a-a690-84ab02ffa1c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.873 186792 DEBUG oslo_concurrency.lockutils [req-30c2ee54-4cb2-4141-99bb-4f401a95a660 req-04b31187-bad0-420a-a690-84ab02ffa1c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.874 186792 DEBUG nova.compute.manager [req-30c2ee54-4cb2-4141-99bb-4f401a95a660 req-04b31187-bad0-420a-a690-84ab02ffa1c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Processing event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.915 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.917 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799579.916335, 623aa62d-3837-4517-b0c9-c705520c154b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.917 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] VM Started (Lifecycle Event)#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.922 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.925 186792 INFO nova.virt.libvirt.driver [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Instance spawned successfully.#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.925 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.948 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.949 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.949 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.950 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.950 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.951 186792 DEBUG nova.virt.libvirt.driver [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.982 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:39 np0005531888 nova_compute[186788]: 2025-11-22 08:19:39.985 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.023 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.023 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799579.916738, 623aa62d-3837-4517-b0c9-c705520c154b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.024 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.058 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.059 186792 INFO nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Took 12.02 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.060 186792 DEBUG nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.063 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799579.9212646, 623aa62d-3837-4517-b0c9-c705520c154b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.063 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.091 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.094 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:19:40 np0005531888 podman[239448]: 2025-11-22 08:19:40.10548012 +0000 UTC m=+0.075158450 container create 59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.113 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.137 186792 INFO nova.compute.manager [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Took 12.59 seconds to build instance.#033[00m
Nov 22 03:19:40 np0005531888 systemd[1]: Started libpod-conmon-59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0.scope.
Nov 22 03:19:40 np0005531888 podman[239448]: 2025-11-22 08:19:40.051926043 +0000 UTC m=+0.021604393 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.151 186792 DEBUG oslo_concurrency.lockutils [None req-a13e9b93-0cf6-4bfd-aeb2-0d45d84d35f0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:40 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:19:40 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81a4d96002b6bca77da64a0f8de78a7950010299436ba6e48f4803dfd6b4689c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:19:40 np0005531888 podman[239448]: 2025-11-22 08:19:40.191499615 +0000 UTC m=+0.161177965 container init 59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:19:40 np0005531888 podman[239448]: 2025-11-22 08:19:40.199986204 +0000 UTC m=+0.169664534 container start 59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:19:40 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [NOTICE]   (239467) : New worker (239469) forked
Nov 22 03:19:40 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [NOTICE]   (239467) : Loading success.
Nov 22 03:19:40 np0005531888 nova_compute[186788]: 2025-11-22 08:19:40.999 186792 DEBUG nova.network.neutron [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updated VIF entry in instance network info cache for port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.000 186792 DEBUG nova.network.neutron [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updating instance_info_cache with network_info: [{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.019 186792 DEBUG oslo_concurrency.lockutils [req-ec62fbc3-fc4f-4cd3-be9a-d3b8902db1f1 req-5ee69da6-3304-4fbe-a5e7-edde7f8f0d8d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.953 186792 DEBUG nova.compute.manager [req-d10f2d19-9ebc-4db7-bfd5-2cb7fbb44fd2 req-1920d94f-adcb-4fe3-b8b8-f5caaa30c6d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.954 186792 DEBUG oslo_concurrency.lockutils [req-d10f2d19-9ebc-4db7-bfd5-2cb7fbb44fd2 req-1920d94f-adcb-4fe3-b8b8-f5caaa30c6d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.954 186792 DEBUG oslo_concurrency.lockutils [req-d10f2d19-9ebc-4db7-bfd5-2cb7fbb44fd2 req-1920d94f-adcb-4fe3-b8b8-f5caaa30c6d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.955 186792 DEBUG oslo_concurrency.lockutils [req-d10f2d19-9ebc-4db7-bfd5-2cb7fbb44fd2 req-1920d94f-adcb-4fe3-b8b8-f5caaa30c6d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.955 186792 DEBUG nova.compute.manager [req-d10f2d19-9ebc-4db7-bfd5-2cb7fbb44fd2 req-1920d94f-adcb-4fe3-b8b8-f5caaa30c6d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] No waiting events found dispatching network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:41 np0005531888 nova_compute[186788]: 2025-11-22 08:19:41.956 186792 WARNING nova.compute.manager [req-d10f2d19-9ebc-4db7-bfd5-2cb7fbb44fd2 req-1920d94f-adcb-4fe3-b8b8-f5caaa30c6d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received unexpected event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:19:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:42Z|00588|binding|INFO|Releasing lport ac76a812-5ead-4b51-8c63-4eaca1b65820 from this chassis (sb_readonly=0)
Nov 22 03:19:42 np0005531888 nova_compute[186788]: 2025-11-22 08:19:42.208 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:42Z|00589|binding|INFO|Releasing lport ac76a812-5ead-4b51-8c63-4eaca1b65820 from this chassis (sb_readonly=0)
Nov 22 03:19:42 np0005531888 nova_compute[186788]: 2025-11-22 08:19:42.278 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:43Z|00590|binding|INFO|Releasing lport ac76a812-5ead-4b51-8c63-4eaca1b65820 from this chassis (sb_readonly=0)
Nov 22 03:19:43 np0005531888 nova_compute[186788]: 2025-11-22 08:19:43.445 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:43 np0005531888 NetworkManager[55166]: <info>  [1763799583.4505] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Nov 22 03:19:43 np0005531888 NetworkManager[55166]: <info>  [1763799583.4514] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Nov 22 03:19:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:43Z|00591|binding|INFO|Releasing lport ac76a812-5ead-4b51-8c63-4eaca1b65820 from this chassis (sb_readonly=0)
Nov 22 03:19:43 np0005531888 nova_compute[186788]: 2025-11-22 08:19:43.474 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:43 np0005531888 nova_compute[186788]: 2025-11-22 08:19:43.480 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:43 np0005531888 podman[239480]: 2025-11-22 08:19:43.67434715 +0000 UTC m=+0.049047317 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal)
Nov 22 03:19:43 np0005531888 nova_compute[186788]: 2025-11-22 08:19:43.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:43 np0005531888 nova_compute[186788]: 2025-11-22 08:19:43.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:44 np0005531888 nova_compute[186788]: 2025-11-22 08:19:44.032 186792 DEBUG nova.compute.manager [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-changed-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:44 np0005531888 nova_compute[186788]: 2025-11-22 08:19:44.033 186792 DEBUG nova.compute.manager [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Refreshing instance network info cache due to event network-changed-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:19:44 np0005531888 nova_compute[186788]: 2025-11-22 08:19:44.033 186792 DEBUG oslo_concurrency.lockutils [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:44 np0005531888 nova_compute[186788]: 2025-11-22 08:19:44.033 186792 DEBUG oslo_concurrency.lockutils [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:44 np0005531888 nova_compute[186788]: 2025-11-22 08:19:44.035 186792 DEBUG nova.network.neutron [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Refreshing network info cache for port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:19:45 np0005531888 podman[239501]: 2025-11-22 08:19:45.706087267 +0000 UTC m=+0.077975689 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:19:45 np0005531888 podman[239502]: 2025-11-22 08:19:45.715105648 +0000 UTC m=+0.082531891 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 03:19:45 np0005531888 nova_compute[186788]: 2025-11-22 08:19:45.739 186792 DEBUG nova.network.neutron [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updated VIF entry in instance network info cache for port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:19:45 np0005531888 nova_compute[186788]: 2025-11-22 08:19:45.740 186792 DEBUG nova.network.neutron [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updating instance_info_cache with network_info: [{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:45 np0005531888 nova_compute[186788]: 2025-11-22 08:19:45.756 186792 DEBUG oslo_concurrency.lockutils [req-5933bc26-78ef-4535-9c49-16388f52ed4e req-cfd135e4-83eb-4478-b189-d25d7a2a8448 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:47 np0005531888 nova_compute[186788]: 2025-11-22 08:19:47.824 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:48 np0005531888 nova_compute[186788]: 2025-11-22 08:19:48.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:48 np0005531888 nova_compute[186788]: 2025-11-22 08:19:48.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:53 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:53Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:20:c3 10.100.0.12
Nov 22 03:19:53 np0005531888 ovn_controller[95067]: 2025-11-22T08:19:53Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:20:c3 10.100.0.12
Nov 22 03:19:53 np0005531888 nova_compute[186788]: 2025-11-22 08:19:53.808 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:53 np0005531888 nova_compute[186788]: 2025-11-22 08:19:53.960 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:54 np0005531888 nova_compute[186788]: 2025-11-22 08:19:54.209 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:56 np0005531888 podman[239558]: 2025-11-22 08:19:56.675500219 +0000 UTC m=+0.049791035 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:19:56 np0005531888 podman[239559]: 2025-11-22 08:19:56.706621406 +0000 UTC m=+0.077678252 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:19:58 np0005531888 nova_compute[186788]: 2025-11-22 08:19:58.811 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:58 np0005531888 nova_compute[186788]: 2025-11-22 08:19:58.961 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:03 np0005531888 nova_compute[186788]: 2025-11-22 08:20:03.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:03 np0005531888 nova_compute[186788]: 2025-11-22 08:20:03.963 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:04.030 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:04 np0005531888 nova_compute[186788]: 2025-11-22 08:20:04.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:04.032 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:20:04 np0005531888 nova_compute[186788]: 2025-11-22 08:20:04.976 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:05.035 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:05 np0005531888 nova_compute[186788]: 2025-11-22 08:20:05.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:07 np0005531888 nova_compute[186788]: 2025-11-22 08:20:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:07 np0005531888 nova_compute[186788]: 2025-11-22 08:20:07.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:20:07 np0005531888 nova_compute[186788]: 2025-11-22 08:20:07.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:20:08 np0005531888 nova_compute[186788]: 2025-11-22 08:20:08.385 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:08 np0005531888 nova_compute[186788]: 2025-11-22 08:20:08.385 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:08 np0005531888 nova_compute[186788]: 2025-11-22 08:20:08.386 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:20:08 np0005531888 nova_compute[186788]: 2025-11-22 08:20:08.386 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 623aa62d-3837-4517-b0c9-c705520c154b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:08 np0005531888 podman[239602]: 2025-11-22 08:20:08.674757549 +0000 UTC m=+0.051596699 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:20:08 np0005531888 nova_compute[186788]: 2025-11-22 08:20:08.818 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:08 np0005531888 nova_compute[186788]: 2025-11-22 08:20:08.965 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:09 np0005531888 podman[239622]: 2025-11-22 08:20:09.679380396 +0000 UTC m=+0.058658824 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.338 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updating instance_info_cache with network_info: [{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.359 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.360 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.360 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.361 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.821 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:13 np0005531888 nova_compute[186788]: 2025-11-22 08:20:13.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:14 np0005531888 podman[239648]: 2025-11-22 08:20:14.689611203 +0000 UTC m=+0.063334448 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:20:14 np0005531888 nova_compute[186788]: 2025-11-22 08:20:14.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:16 np0005531888 podman[239670]: 2025-11-22 08:20:16.701288757 +0000 UTC m=+0.075569579 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:20:16 np0005531888 podman[239671]: 2025-11-22 08:20:16.701773989 +0000 UTC m=+0.072944365 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:20:16 np0005531888 nova_compute[186788]: 2025-11-22 08:20:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:16 np0005531888 nova_compute[186788]: 2025-11-22 08:20:16.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:16 np0005531888 nova_compute[186788]: 2025-11-22 08:20:16.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:16 np0005531888 nova_compute[186788]: 2025-11-22 08:20:16.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:16 np0005531888 nova_compute[186788]: 2025-11-22 08:20:16.976 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.044 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.129 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.130 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.183 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.323 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.324 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5509MB free_disk=73.23781204223633GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.324 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.324 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.424 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 623aa62d-3837-4517-b0c9-c705520c154b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.425 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.425 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.441 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.587 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.587 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.611 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.653 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.696 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.709 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.732 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:20:17 np0005531888 nova_compute[186788]: 2025-11-22 08:20:17.732 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:18 np0005531888 nova_compute[186788]: 2025-11-22 08:20:18.824 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:18 np0005531888 nova_compute[186788]: 2025-11-22 08:20:18.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:19 np0005531888 nova_compute[186788]: 2025-11-22 08:20:19.732 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:21 np0005531888 nova_compute[186788]: 2025-11-22 08:20:21.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:21 np0005531888 nova_compute[186788]: 2025-11-22 08:20:21.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:20:23 np0005531888 nova_compute[186788]: 2025-11-22 08:20:23.827 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:23 np0005531888 nova_compute[186788]: 2025-11-22 08:20:23.971 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:27 np0005531888 podman[239720]: 2025-11-22 08:20:27.668744106 +0000 UTC m=+0.045843209 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:20:27 np0005531888 podman[239721]: 2025-11-22 08:20:27.675516572 +0000 UTC m=+0.045627643 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:20:28 np0005531888 nova_compute[186788]: 2025-11-22 08:20:28.830 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:28 np0005531888 nova_compute[186788]: 2025-11-22 08:20:28.972 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.240 186792 DEBUG nova.compute.manager [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.655 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.656 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.781 186792 DEBUG nova.objects.instance [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'pci_requests' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.804 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.804 186792 INFO nova.compute.claims [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.805 186792 DEBUG nova.objects.instance [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'resources' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.821 186792 DEBUG nova.objects.instance [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'numa_topology' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.839 186792 DEBUG nova.objects.instance [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'pci_devices' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.983 186792 INFO nova.compute.resource_tracker [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating resource usage from migration e34e02d3-9f5b-48e1-82d2-ca0d3bbe1c70#033[00m
Nov 22 03:20:32 np0005531888 nova_compute[186788]: 2025-11-22 08:20:32.983 186792 DEBUG nova.compute.resource_tracker [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Starting to track incoming migration e34e02d3-9f5b-48e1-82d2-ca0d3bbe1c70 with flavor 31612188-3cd6-428b-9166-9568f0affd4a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.251 186792 DEBUG nova.compute.provider_tree [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.273 186792 DEBUG nova.scheduler.client.report [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.364 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.364 186792 INFO nova.compute.manager [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Migrating#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.833 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:33 np0005531888 nova_compute[186788]: 2025-11-22 08:20:33.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:36 np0005531888 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 03:20:36 np0005531888 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 03:20:36 np0005531888 systemd-logind[825]: New session 43 of user nova.
Nov 22 03:20:36 np0005531888 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 03:20:36 np0005531888 systemd[1]: Starting User Manager for UID 42436...
Nov 22 03:20:36 np0005531888 systemd[239764]: Queued start job for default target Main User Target.
Nov 22 03:20:36 np0005531888 systemd[239764]: Created slice User Application Slice.
Nov 22 03:20:36 np0005531888 systemd[239764]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 03:20:36 np0005531888 systemd[239764]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 03:20:36 np0005531888 systemd[239764]: Reached target Paths.
Nov 22 03:20:36 np0005531888 systemd[239764]: Reached target Timers.
Nov 22 03:20:36 np0005531888 systemd[239764]: Starting D-Bus User Message Bus Socket...
Nov 22 03:20:36 np0005531888 systemd[239764]: Starting Create User's Volatile Files and Directories...
Nov 22 03:20:36 np0005531888 systemd[239764]: Listening on D-Bus User Message Bus Socket.
Nov 22 03:20:36 np0005531888 systemd[239764]: Reached target Sockets.
Nov 22 03:20:36 np0005531888 systemd[239764]: Finished Create User's Volatile Files and Directories.
Nov 22 03:20:36 np0005531888 systemd[239764]: Reached target Basic System.
Nov 22 03:20:36 np0005531888 systemd[239764]: Reached target Main User Target.
Nov 22 03:20:36 np0005531888 systemd[239764]: Startup finished in 168ms.
Nov 22 03:20:36 np0005531888 systemd[1]: Started User Manager for UID 42436.
Nov 22 03:20:36 np0005531888 systemd[1]: Started Session 43 of User nova.
Nov 22 03:20:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:36.834 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:20:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:36.836 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:20:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:36.838 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.850 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '623aa62d-3837-4517-b0c9-c705520c154b', 'name': 'tempest-TestGettingAddress-server-854443914', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000097', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.851 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.851 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>]
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.857 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 623aa62d-3837-4517-b0c9-c705520c154b / tap2b3fde18-35 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.858 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '638cccbf-83a3-40a1-a43e-01357a7429f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.852177', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1f9aa25e-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': '1e6ab38f64b6e24d56bb82f0419e8dc217eb1626d5435b7351f1ac219cb7144e'}]}, 'timestamp': '2025-11-22 08:20:36.858862', '_unique_id': '9b2d10a03b2d477abde77de6bcb06551'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.875 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.876 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41f851d3-488c-4b87-80d8-5e807c4f9c58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.861707', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f9d5332-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.561228233, 'message_signature': '19e70f975e6b33b58569e58a88b94bdbb40999725c970856ed7e47fb7ee815cb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.861707', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f9d5fc6-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.561228233, 'message_signature': '417a078d7526e632925d31324d7e5869db0a5bd1fd6ab2171a6fed1b7aafadb7'}]}, 'timestamp': '2025-11-22 08:20:36.876672', '_unique_id': 'b0b0f45692b54447a91a68b38f881eaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:20:36 np0005531888 systemd[1]: session-43.scope: Deactivated successfully.
Nov 22 03:20:36 np0005531888 systemd-logind[825]: Session 43 logged out. Waiting for processes to exit.
Nov 22 03:20:36 np0005531888 systemd-logind[825]: Removed session 43.
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.910 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.write.requests volume: 287 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.911 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73bb4d4d-7275-4373-9539-abcbe05c0682', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 287, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.878658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa29f36-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': 'a6def66e25d4780d2b99efa1ab7eaca1d6f8845ef2b15ce393e4060e3d69d99a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.878658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa2ab34-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '9632d688efb4a5df7da29b0f9963005cb807cef97cc67a0170387ae73e2e8de0'}]}, 'timestamp': '2025-11-22 08:20:36.911305', '_unique_id': '75a087d8e8d1468a9a0bc1db29b23dc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.912 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.913 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.913 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>]
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.913 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.930 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/cpu volume: 13540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8a32697-cb3d-4b17-8fab-9b7bd3026ba9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13540000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'timestamp': '2025-11-22T08:20:36.913652', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1fa5a334-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.629856601, 'message_signature': '332b800374938ce96833207e92502af682b9a7429f219930dc432cfd50d30e5f'}]}, 'timestamp': '2025-11-22 08:20:36.930796', '_unique_id': '5cbcda12d4004d989898258ef424dfd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/memory.usage volume: 42.734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c6da5b3-36e6-4f62-a617-8baa551f978c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.734375, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'timestamp': '2025-11-22T08:20:36.932207', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1fa5e56a-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.629856601, 'message_signature': 'c0ab7a97e7e86328c903d705acb162080cfe2612895dc27c4949c985063f99d3'}]}, 'timestamp': '2025-11-22 08:20:36.932448', '_unique_id': '68807e56fa654860bffb195bc2b7f8b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.933 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7610795c-ed59-4740-8859-361a83cf20d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.933622', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa61d0a-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': 'df4d9aeb99a5147e308c28457e53b5569c36df04ade98d93b057a01d53591bb4'}]}, 'timestamp': '2025-11-22 08:20:36.933881', '_unique_id': 'dda1319950e64ef6bd814ab8a6c60641'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.935 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.read.latency volume: 1392448198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.935 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.read.latency volume: 54827229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af0efa25-88f2-438e-8f6e-2517df6fb529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1392448198, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.935035', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa6539c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '4e5380690fe1dbc7d2dc79a20c1fe7215c736e1918481338b66113c64c766c6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54827229, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.935035', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa65c3e-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': 'af7781df078e3b570384592a84e341545e51f190f99235d6c88f03d678b7cb97'}]}, 'timestamp': '2025-11-22 08:20:36.935482', '_unique_id': '6399981b443b436a90425d391cab51ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.936 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6afccd7f-ee71-4346-88fe-6094dbf82632', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.936748', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa6969a-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.561228233, 'message_signature': 'ff840c7670ba0cf6ec6393270b149b4ac6aeaf31eb8e50c9e05437bed8a22462'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.936748', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa69fe6-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.561228233, 'message_signature': '3de838406e67470b6c99dda76484a919ad056adcbe6e50a93b46e734a4b6cfde'}]}, 'timestamp': '2025-11-22 08:20:36.937225', '_unique_id': '0f2534a8503a48a1a04087e48f0d2ece'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.937 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.938 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15577788-5eb6-4c64-bd15-e5fa77db72ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.938447', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa6d9e8-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': '54e5b22b40cfbb2ba7cd30cfe391afc4274ce4203bc14a6cf80a08bcb29b64fd'}]}, 'timestamp': '2025-11-22 08:20:36.938745', '_unique_id': '64de4ad9471d42d8980cf0d355d2a3ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.939 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.outgoing.bytes volume: 4048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5744b5cb-1e14-40b6-be63-971c044d412b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4048, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.939940', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa71322-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': 'ecef354a7277942250158176d89b08c2a874dfe130ab0e48a840456d9f30fa82'}]}, 'timestamp': '2025-11-22 08:20:36.940181', '_unique_id': '0e71c5ad5bc54200806de5fa27feb40f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.941 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.incoming.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '828550ad-7bc6-4ac0-baed-dd0674e40310', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.941353', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa74a7c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': '5e6c2a404c32655ba294190bcb20e8269f9239b526bf7b742d7fdbfec1f46d14'}]}, 'timestamp': '2025-11-22 08:20:36.941621', '_unique_id': 'db99652c88b44803a3173f1ffda81eda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.942 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.read.bytes volume: 30833152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26a05834-cc15-4ce3-a177-61edcb217bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30833152, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.942785', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa7821c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': 'a23f5a728e8769847a0d31e4852a42887fe83ea3c1ec6cfad4eb7e0ba37af9b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.942785', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa78aa0-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '82636a6cc5eb669a519e50bb6548b9e47ce421c8567c58c4d776119c7836e7ab'}]}, 'timestamp': '2025-11-22 08:20:36.943234', '_unique_id': '5544d31d1e1d4a9890e7bdfd16fd3f22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.944 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.read.requests volume: 1116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.944 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '695468dc-888e-4496-89b4-480d0a522f2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1116, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.944489', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa7c600-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': 'd712705630e6dbee0796ee4a56133e483c2981500261930d536ad05707a85fa5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.944489', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa7d154-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '710e10e74700e7edcd15ee1df10a917d704c70a9fedf3e7e10282add584e6a36'}]}, 'timestamp': '2025-11-22 08:20:36.945054', '_unique_id': '786ec483cd8c4adbad07a1e5006d30e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.945 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4464751f-6105-4ff9-8042-d8c459fce9d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.946244', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa8098a-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': 'ef7eaa73df96156ff25113ff6f26f5f5c48db80e066dd8d3488aa427a9d776ae'}]}, 'timestamp': '2025-11-22 08:20:36.946488', '_unique_id': 'a8b8673cd5b447879b9b04c8ee61ec85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.947 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71bb5233-f753-4fb2-82c8-20adab06431f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.947654', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa8413e-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': '6e42e6f61787d729de797f1b54fd43f7347607d52a7037884017c9d854e8bfa3'}]}, 'timestamp': '2025-11-22 08:20:36.947920', '_unique_id': 'add4e963f726408a9f137960af23ccd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89fa4957-ac4b-417a-b17a-cbd4b33c6386', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.949074', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa877e4-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': '0edf71869c14e16fd6f26e16d9e43b63337a332c2bd1749f984765f21a9ef54d'}]}, 'timestamp': '2025-11-22 08:20:36.949312', '_unique_id': '38e4ab8f957a436a89c30edc9bc87183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.950 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.950 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f78bc6b-045d-48ac-ba85-c7782bdc4782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.950433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa8ace6-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '34a78d1853e44c0fbddfd1008d343a0da6bb285816aff518fe56992b4943fa5d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.950433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa8b664-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '354b2b5096c49c3bb16e52607ba7f35c60643bb23ececce4352244caee805062'}]}, 'timestamp': '2025-11-22 08:20:36.950897', '_unique_id': '611075edfa0b41888e9e2f19c56242be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.incoming.bytes volume: 5701 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '721ea1a3-fde1-4a59-b5e5-ca6199921da2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5701, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.952098', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa8ee0e-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': 'b035c18e8d6704de95af754f06995682b96adfbb9f425118576da27b9a0773a3'}]}, 'timestamp': '2025-11-22 08:20:36.952336', '_unique_id': 'd9145508b39d4be79662fd5f1c406bc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.953 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9c9ace1-ef8c-4b7d-9497-1a80a03b8230', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000097-623aa62d-3837-4517-b0c9-c705520c154b-tap2b3fde18-35', 'timestamp': '2025-11-22T08:20:36.953639', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'tap2b3fde18-35', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:20:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2b3fde18-35'}, 'message_id': '1fa92a22-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.551663388, 'message_signature': '817a9c30706e680862ec9b37d637fe13979793cdeff3d5034a8bee7fb6305746'}]}, 'timestamp': '2025-11-22 08:20:36.953881', '_unique_id': '93247d9b1dfe4cddb0cc83fd8ac9e065'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.954 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>]
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-854443914>]
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.955 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df840210-806c-4f76-b64e-c3811d10bbb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.955679', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa97b08-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.561228233, 'message_signature': '4b266d206c1b915b1c6a92f5b4f6d431b16e69c92886a514b27a8a1826b0c7ef'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.955679', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa98490-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.561228233, 'message_signature': 'b93a7610acbe957ce88d2b8f8806d357b3dec9ca490001bea6e339312140a035'}]}, 'timestamp': '2025-11-22 08:20:36.956174', '_unique_id': '685eef12d12942f2a88b1d7d5fc2021e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.957 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.write.latency volume: 3575681518 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.957 12 DEBUG ceilometer.compute.pollsters [-] 623aa62d-3837-4517-b0c9-c705520c154b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5889ddc-9b42-4f2c-8003-d094f8c898c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3575681518, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-vda', 'timestamp': '2025-11-22T08:20:36.957357', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1fa9bb72-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': 'ba1158623caa8843337f09527f69e2ae127abfa55137ccbfd7e9abbb1db4edad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '623aa62d-3837-4517-b0c9-c705520c154b-sda', 'timestamp': '2025-11-22T08:20:36.957357', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-854443914', 'name': 'instance-00000097', 'instance_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1fa9c4be-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6377.57817942, 'message_signature': '542379287f2b2027f9e62eb5c5fbe87fc05345d54e6e68f54b8c909e5f684273'}]}, 'timestamp': '2025-11-22 08:20:36.957828', '_unique_id': '0c931f2f1bb74e36ab45b1ad43859ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:20:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:37 np0005531888 systemd-logind[825]: New session 45 of user nova.
Nov 22 03:20:37 np0005531888 systemd[1]: Started Session 45 of User nova.
Nov 22 03:20:37 np0005531888 systemd[1]: session-45.scope: Deactivated successfully.
Nov 22 03:20:37 np0005531888 systemd-logind[825]: Session 45 logged out. Waiting for processes to exit.
Nov 22 03:20:37 np0005531888 systemd-logind[825]: Removed session 45.
Nov 22 03:20:38 np0005531888 nova_compute[186788]: 2025-11-22 08:20:38.837 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:38 np0005531888 nova_compute[186788]: 2025-11-22 08:20:38.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:39 np0005531888 podman[239787]: 2025-11-22 08:20:39.678985756 +0000 UTC m=+0.053858535 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:20:40 np0005531888 nova_compute[186788]: 2025-11-22 08:20:40.387 186792 DEBUG nova.compute.manager [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:40 np0005531888 nova_compute[186788]: 2025-11-22 08:20:40.388 186792 DEBUG oslo_concurrency.lockutils [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:40 np0005531888 nova_compute[186788]: 2025-11-22 08:20:40.388 186792 DEBUG oslo_concurrency.lockutils [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:40 np0005531888 nova_compute[186788]: 2025-11-22 08:20:40.388 186792 DEBUG oslo_concurrency.lockutils [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:40 np0005531888 nova_compute[186788]: 2025-11-22 08:20:40.388 186792 DEBUG nova.compute.manager [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:40 np0005531888 nova_compute[186788]: 2025-11-22 08:20:40.389 186792 WARNING nova.compute.manager [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:20:40 np0005531888 systemd-logind[825]: New session 46 of user nova.
Nov 22 03:20:40 np0005531888 systemd[1]: Started Session 46 of User nova.
Nov 22 03:20:40 np0005531888 podman[239810]: 2025-11-22 08:20:40.478530409 +0000 UTC m=+0.051792285 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:20:41 np0005531888 systemd[1]: session-46.scope: Deactivated successfully.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: Session 46 logged out. Waiting for processes to exit.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: Removed session 46.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: New session 47 of user nova.
Nov 22 03:20:41 np0005531888 systemd[1]: Started Session 47 of User nova.
Nov 22 03:20:41 np0005531888 systemd[1]: session-47.scope: Deactivated successfully.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: Session 47 logged out. Waiting for processes to exit.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: Removed session 47.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: New session 48 of user nova.
Nov 22 03:20:41 np0005531888 systemd[1]: Started Session 48 of User nova.
Nov 22 03:20:41 np0005531888 systemd[1]: session-48.scope: Deactivated successfully.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: Session 48 logged out. Waiting for processes to exit.
Nov 22 03:20:41 np0005531888 systemd-logind[825]: Removed session 48.
Nov 22 03:20:42 np0005531888 nova_compute[186788]: 2025-11-22 08:20:42.511 186792 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:42 np0005531888 nova_compute[186788]: 2025-11-22 08:20:42.513 186792 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:42 np0005531888 nova_compute[186788]: 2025-11-22 08:20:42.514 186792 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:42 np0005531888 nova_compute[186788]: 2025-11-22 08:20:42.514 186792 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:42 np0005531888 nova_compute[186788]: 2025-11-22 08:20:42.515 186792 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:42 np0005531888 nova_compute[186788]: 2025-11-22 08:20:42.516 186792 WARNING nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:20:43 np0005531888 nova_compute[186788]: 2025-11-22 08:20:43.027 186792 INFO nova.network.neutron [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating port eda1ac92-e156-463f-9f90-8fdd14f55dc0 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 22 03:20:43 np0005531888 nova_compute[186788]: 2025-11-22 08:20:43.840 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:43 np0005531888 nova_compute[186788]: 2025-11-22 08:20:43.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:44 np0005531888 nova_compute[186788]: 2025-11-22 08:20:44.162 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:44 np0005531888 nova_compute[186788]: 2025-11-22 08:20:44.162 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:44 np0005531888 nova_compute[186788]: 2025-11-22 08:20:44.162 186792 DEBUG nova.network.neutron [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.074 186792 DEBUG nova.compute.manager [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-changed-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.074 186792 DEBUG nova.compute.manager [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Refreshing instance network info cache due to event network-changed-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.075 186792 DEBUG oslo_concurrency.lockutils [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.075 186792 DEBUG oslo_concurrency.lockutils [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.075 186792 DEBUG nova.network.neutron [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Refreshing network info cache for port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.152 186792 DEBUG nova.compute.manager [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.153 186792 DEBUG nova.compute.manager [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.153 186792 DEBUG oslo_concurrency.lockutils [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.225 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.225 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.225 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.225 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.226 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.233 186792 INFO nova.compute.manager [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Terminating instance#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.241 186792 DEBUG nova.compute.manager [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:20:45 np0005531888 kernel: tap2b3fde18-35 (unregistering): left promiscuous mode
Nov 22 03:20:45 np0005531888 NetworkManager[55166]: <info>  [1763799645.2768] device (tap2b3fde18-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.286 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:45Z|00592|binding|INFO|Releasing lport 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 from this chassis (sb_readonly=0)
Nov 22 03:20:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:45Z|00593|binding|INFO|Setting lport 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 down in Southbound
Nov 22 03:20:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:45Z|00594|binding|INFO|Removing iface tap2b3fde18-35 ovn-installed in OVS
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.310 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:45.307 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:20:c3 10.100.0.12 2001:db8:0:1:f816:3eff:fe1c:20c3 2001:db8::f816:3eff:fe1c:20c3'], port_security=['fa:16:3e:1c:20:c3 10.100.0.12 2001:db8:0:1:f816:3eff:fe1c:20c3 2001:db8::f816:3eff:fe1c:20c3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8:0:1:f816:3eff:fe1c:20c3/64 2001:db8::f816:3eff:fe1c:20c3/64', 'neutron:device_id': '623aa62d-3837-4517-b0c9-c705520c154b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '04c6695d-d046-49fb-a069-528067303a16', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:45.309 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 in datapath cfb1249f-37ac-4df7-b559-e7968406997d unbound from our chassis#033[00m
Nov 22 03:20:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:45.311 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:20:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:45.313 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[809c4c88-ed7c-4a23-9b0c-fec91bf12ae8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:45.314 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d namespace which is not needed anymore#033[00m
Nov 22 03:20:45 np0005531888 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000097.scope: Deactivated successfully.
Nov 22 03:20:45 np0005531888 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000097.scope: Consumed 17.058s CPU time.
Nov 22 03:20:45 np0005531888 systemd-machined[153106]: Machine qemu-73-instance-00000097 terminated.
Nov 22 03:20:45 np0005531888 podman[239848]: 2025-11-22 08:20:45.368006668 +0000 UTC m=+0.066268890 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.512 186792 INFO nova.virt.libvirt.driver [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Instance destroyed successfully.#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.513 186792 DEBUG nova.objects.instance [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 623aa62d-3837-4517-b0c9-c705520c154b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.526 186792 DEBUG nova.virt.libvirt.vif [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:19:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-854443914',display_name='tempest-TestGettingAddress-server-854443914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-854443914',id=151,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFe+Iivl03JqMU254FLdbdmYsFU6DbbfEEAr4K/FY8GDuQ0mBhcnts9hxMb1kzVXY50lm7S9mYwpnOmQECnf5XsVq5CeQ5VY2CUHiqO5dq+d/xeUGYNH940WARTUGprt0Q==',key_name='tempest-TestGettingAddress-223844638',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:19:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-u012fm3c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:19:40Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=623aa62d-3837-4517-b0c9-c705520c154b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.526 186792 DEBUG nova.network.os_vif_util [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.527 186792 DEBUG nova.network.os_vif_util [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.528 186792 DEBUG os_vif [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.529 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.530 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b3fde18-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.531 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.532 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.535 186792 INFO os_vif [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:20:c3,bridge_name='br-int',has_traffic_filtering=True,id=2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b3fde18-35')#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.535 186792 INFO nova.virt.libvirt.driver [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Deleting instance files /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b_del#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.536 186792 INFO nova.virt.libvirt.driver [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Deletion of /var/lib/nova/instances/623aa62d-3837-4517-b0c9-c705520c154b_del complete#033[00m
Nov 22 03:20:45 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [NOTICE]   (239467) : haproxy version is 2.8.14-c23fe91
Nov 22 03:20:45 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [NOTICE]   (239467) : path to executable is /usr/sbin/haproxy
Nov 22 03:20:45 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [WARNING]  (239467) : Exiting Master process...
Nov 22 03:20:45 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [WARNING]  (239467) : Exiting Master process...
Nov 22 03:20:45 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [ALERT]    (239467) : Current worker (239469) exited with code 143 (Terminated)
Nov 22 03:20:45 np0005531888 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[239463]: [WARNING]  (239467) : All workers exited. Exiting... (0)
Nov 22 03:20:45 np0005531888 systemd[1]: libpod-59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0.scope: Deactivated successfully.
Nov 22 03:20:45 np0005531888 podman[239893]: 2025-11-22 08:20:45.562718447 +0000 UTC m=+0.162013885 container died 59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.584 186792 DEBUG nova.network.neutron [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.603 186792 INFO nova.compute.manager [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.603 186792 DEBUG oslo.service.loopingcall [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.603 186792 DEBUG nova.compute.manager [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.604 186792 DEBUG nova.network.neutron [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.619 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.626 186792 DEBUG oslo_concurrency.lockutils [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.626 186792 DEBUG nova.network.neutron [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.736 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.737 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.737 186792 INFO nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Creating image(s)#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.738 186792 DEBUG nova.objects.instance [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.748 186792 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.819 186792 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.820 186792 DEBUG nova.virt.disk.api [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Checking if we can resize image /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.820 186792 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.874 186792 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.875 186792 DEBUG nova.virt.disk.api [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Cannot resize image /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.891 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.891 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Ensure instance console log exists: /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.892 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.893 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.893 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.895 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start _get_guest_xml network_info=[{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.900 186792 WARNING nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.907 186792 DEBUG nova.virt.libvirt.host [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.908 186792 DEBUG nova.virt.libvirt.host [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.911 186792 DEBUG nova.virt.libvirt.host [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.911 186792 DEBUG nova.virt.libvirt.host [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.913 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.913 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.914 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.914 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.914 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.915 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.915 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.915 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.915 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.916 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.916 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.916 186792 DEBUG nova.virt.hardware [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.917 186792 DEBUG nova.objects.instance [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:45 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0-userdata-shm.mount: Deactivated successfully.
Nov 22 03:20:45 np0005531888 systemd[1]: var-lib-containers-storage-overlay-81a4d96002b6bca77da64a0f8de78a7950010299436ba6e48f4803dfd6b4689c-merged.mount: Deactivated successfully.
Nov 22 03:20:45 np0005531888 nova_compute[186788]: 2025-11-22 08:20:45.937 186792 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.033 186792 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.034 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.035 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.036 186792 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.038 186792 DEBUG nova.virt.libvirt.vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:20:42Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.038 186792 DEBUG nova.network.os_vif_util [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.039 186792 DEBUG nova.network.os_vif_util [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.042 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <uuid>074d9b5a-057a-46af-aea1-0f43e0ac7418</uuid>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <name>instance-00000098</name>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-550951011</nova:name>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:20:45</nova:creationTime>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        <nova:port uuid="eda1ac92-e156-463f-9f90-8fdd14f55dc0">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <entry name="serial">074d9b5a-057a-46af-aea1-0f43e0ac7418</entry>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <entry name="uuid">074d9b5a-057a-46af-aea1-0f43e0ac7418</entry>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:b3:0e:98"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <target dev="tapeda1ac92-e1"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/console.log" append="off"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:20:46 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:20:46 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:20:46 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:20:46 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.044 186792 DEBUG nova.virt.libvirt.vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:20:42Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.045 186792 DEBUG nova.network.os_vif_util [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.046 186792 DEBUG nova.network.os_vif_util [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.046 186792 DEBUG os_vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.047 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.047 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.048 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.050 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.050 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeda1ac92-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.051 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeda1ac92-e1, col_values=(('external_ids', {'iface-id': 'eda1ac92-e156-463f-9f90-8fdd14f55dc0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:0e:98', 'vm-uuid': '074d9b5a-057a-46af-aea1-0f43e0ac7418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.052 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 NetworkManager[55166]: <info>  [1763799646.0537] manager: (tapeda1ac92-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.058 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.059 186792 INFO os_vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1')#033[00m
Nov 22 03:20:46 np0005531888 podman[239893]: 2025-11-22 08:20:46.089505533 +0000 UTC m=+0.688800971 container cleanup 59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:20:46 np0005531888 systemd[1]: libpod-conmon-59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0.scope: Deactivated successfully.
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.214 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.216 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.216 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No VIF found with MAC fa:16:3e:b3:0e:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.216 186792 INFO nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Using config drive#033[00m
Nov 22 03:20:46 np0005531888 kernel: tapeda1ac92-e1: entered promiscuous mode
Nov 22 03:20:46 np0005531888 NetworkManager[55166]: <info>  [1763799646.2788] manager: (tapeda1ac92-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/280)
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.279 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:46Z|00595|binding|INFO|Claiming lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 for this chassis.
Nov 22 03:20:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:46Z|00596|binding|INFO|eda1ac92-e156-463f-9f90-8fdd14f55dc0: Claiming fa:16:3e:b3:0e:98 10.100.0.5
Nov 22 03:20:46 np0005531888 systemd-udevd[239857]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:20:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:46Z|00597|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 ovn-installed in OVS
Nov 22 03:20:46 np0005531888 NetworkManager[55166]: <info>  [1763799646.2978] device (tapeda1ac92-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:20:46 np0005531888 NetworkManager[55166]: <info>  [1763799646.2991] device (tapeda1ac92-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:46Z|00598|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 up in Southbound
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.310 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:0e:98 10.100.0.5'], port_security=['fa:16:3e:b3:0e:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a792325d-0f21-49cc-9e79-20df696791a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46980f3d-5c4c-4eaa-bdbc-e9dfef13e740, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=eda1ac92-e156-463f-9f90-8fdd14f55dc0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:46 np0005531888 systemd-machined[153106]: New machine qemu-74-instance-00000098.
Nov 22 03:20:46 np0005531888 systemd[1]: Started Virtual Machine qemu-74-instance-00000098.
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.578 186792 DEBUG nova.compute.manager [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.579 186792 DEBUG oslo_concurrency.lockutils [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.581 186792 DEBUG oslo_concurrency.lockutils [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.581 186792 DEBUG oslo_concurrency.lockutils [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.581 186792 DEBUG nova.compute.manager [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.581 186792 WARNING nova.compute.manager [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 22 03:20:46 np0005531888 podman[239951]: 2025-11-22 08:20:46.674998182 +0000 UTC m=+0.561920581 container remove 59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.682 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[99c7d01b-18f8-4f71-8dc3-969b959dd0f9]: (4, ('Sat Nov 22 08:20:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d (59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0)\n59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0\nSat Nov 22 08:20:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d (59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0)\n59fd4fb9dbd6f8544ee913eae9a6019934724ab9c0652c989e77a59fdb83f6c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.685 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7902eb-f7e6-4320-95af-29a562215167]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.686 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfb1249f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.689 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 kernel: tapcfb1249f-30: left promiscuous mode
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.697 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799646.6969907, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.698 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.702 186792 DEBUG nova.compute.manager [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.709 186792 INFO nova.virt.libvirt.driver [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance running successfully.#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.711 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531888 virtqemud[186358]: argument unsupported: QEMU guest agent is not configured
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.715 186792 DEBUG nova.virt.libvirt.guest [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.715 186792 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.716 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[be7a3606-cb90-4b76-9d38-38b69cb3df31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.725 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.730 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.729 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0801661c-8f88-4dc1-ba96-fe7221884fe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.734 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[768bfaa2-9f2b-42ca-b8ab-07fdfb77c6c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.749 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.749 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799646.6985765, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.750 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Started (Lifecycle Event)#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.754 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5f11b3-0733-4aae-b68f-9d6bf58392a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632011, 'reachable_time': 36869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239998, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.758 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.758 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[98cb228d-79a5-4f05-8e6e-82271c1bec28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.760 104023 INFO neutron.agent.ovn.metadata.agent [-] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 in datapath 52d2fbd4-6713-49c3-93b1-794bccb91cb5 unbound from our chassis#033[00m
Nov 22 03:20:46 np0005531888 systemd[1]: run-netns-ovnmeta\x2dcfb1249f\x2d37ac\x2d4df7\x2db559\x2de7968406997d.mount: Deactivated successfully.
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.761 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52d2fbd4-6713-49c3-93b1-794bccb91cb5#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.775 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2a75ad50-f118-4aed-a95c-7752dad9e8e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.776 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52d2fbd4-61 in ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.779 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52d2fbd4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.779 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9802acb8-21fa-4ef8-947f-52534bf97d47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.780 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a20df42f-c4c5-4b50-9bdf-590e1253dcdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.799 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.798 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[9784a3bb-a6b1-4d25-858a-a00c927350c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 nova_compute[186788]: 2025-11-22 08:20:46.805 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.832 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[09582821-ac6e-4f1c-8f17-ed0faa3e6cad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 podman[239997]: 2025-11-22 08:20:46.843364242 +0000 UTC m=+0.082654994 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.869 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fef96bd9-d100-4b87-a85b-cbf2de6844d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.877 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3f1c82a3-782c-4a74-886c-6fbba57579af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 NetworkManager[55166]: <info>  [1763799646.8781] manager: (tap52d2fbd4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/281)
Nov 22 03:20:46 np0005531888 podman[239999]: 2025-11-22 08:20:46.889312262 +0000 UTC m=+0.131278330 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.919 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[54bc2e42-c04c-48c1-8539-36cc5e73ffca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.922 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab96365-436a-493d-9573-a0278c630112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 NetworkManager[55166]: <info>  [1763799646.9514] device (tap52d2fbd4-60): carrier: link connected
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.953 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e1efbf-08ef-404e-8ebc-3787afe8e170]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.975 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[005852b3-b4c6-4f92-ad03-8b6be9a79958]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d2fbd4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:e8:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638759, 'reachable_time': 43593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240065, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:46.996 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6d980b3c-eae4-45f1-900c-ae96ebf7f4ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:e832'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638759, 'tstamp': 638759}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240066, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.016 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[35cc9062-497b-451e-b4b8-d8d124489405]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d2fbd4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:e8:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638759, 'reachable_time': 43593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240067, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.056 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7a34b3e6-1c7e-4ee0-b884-371224397432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.135 186792 DEBUG nova.network.neutron [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.153 186792 INFO nova.compute.manager [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Took 1.55 seconds to deallocate network for instance.#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.172 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4f1dcd-627c-4d75-8ce1-a572d1119857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.173 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d2fbd4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.174 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.174 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52d2fbd4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:47 np0005531888 NetworkManager[55166]: <info>  [1763799647.1772] manager: (tap52d2fbd4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.178 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:47 np0005531888 kernel: tap52d2fbd4-60: entered promiscuous mode
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.181 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52d2fbd4-60, col_values=(('external_ids', {'iface-id': 'a25db17f-5074-4e3e-9504-ee30cd3f6d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:47Z|00599|binding|INFO|Releasing lport a25db17f-5074-4e3e-9504-ee30cd3f6d4c from this chassis (sb_readonly=0)
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.200 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.200 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.201 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[24f5ba51-5eed-40a6-9149-2652c3e57689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.201 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:20:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:47.202 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'env', 'PROCESS_TAG=haproxy-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52d2fbd4-6713-49c3-93b1-794bccb91cb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.222 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.223 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.255 186792 DEBUG nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-vif-unplugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.256 186792 DEBUG oslo_concurrency.lockutils [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.256 186792 DEBUG oslo_concurrency.lockutils [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.256 186792 DEBUG oslo_concurrency.lockutils [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.256 186792 DEBUG nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] No waiting events found dispatching network-vif-unplugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.257 186792 WARNING nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received unexpected event network-vif-unplugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.257 186792 DEBUG nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.257 186792 DEBUG oslo_concurrency.lockutils [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "623aa62d-3837-4517-b0c9-c705520c154b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.257 186792 DEBUG oslo_concurrency.lockutils [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.258 186792 DEBUG oslo_concurrency.lockutils [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.258 186792 DEBUG nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] No waiting events found dispatching network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.258 186792 WARNING nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received unexpected event network-vif-plugged-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.259 186792 DEBUG nova.compute.manager [req-6c7da534-fcdb-45d1-acd4-da98550186c4 req-576b3e81-547d-47fc-bb2c-321f1a12ff7d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Received event network-vif-deleted-2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.308 186792 DEBUG nova.compute.provider_tree [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.321 186792 DEBUG nova.scheduler.client.report [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.348 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.385 186792 INFO nova.scheduler.client.report [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 623aa62d-3837-4517-b0c9-c705520c154b#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.562 186792 DEBUG oslo_concurrency.lockutils [None req-30e316cb-7d5d-4296-9c5f-3f4ef0581a9e 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "623aa62d-3837-4517-b0c9-c705520c154b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:47 np0005531888 podman[240101]: 2025-11-22 08:20:47.651292911 +0000 UTC m=+0.086442187 container create 4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 03:20:47 np0005531888 podman[240101]: 2025-11-22 08:20:47.596380901 +0000 UTC m=+0.031530197 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:20:47 np0005531888 systemd[1]: Started libpod-conmon-4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf.scope.
Nov 22 03:20:47 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:20:47 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af929e77e4e790d76f676f7d1f75d6f5e06be222f9dce6ad97e9459a19461738/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:20:47 np0005531888 podman[240101]: 2025-11-22 08:20:47.760171609 +0000 UTC m=+0.195320905 container init 4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:20:47 np0005531888 podman[240101]: 2025-11-22 08:20:47.767037928 +0000 UTC m=+0.202187204 container start 4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:20:47 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [NOTICE]   (240120) : New worker (240122) forked
Nov 22 03:20:47 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [NOTICE]   (240120) : Loading success.
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.837 186792 DEBUG nova.network.neutron [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.838 186792 DEBUG nova.network.neutron [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:47 np0005531888 nova_compute[186788]: 2025-11-22 08:20:47.868 186792 DEBUG oslo_concurrency.lockutils [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.655 186792 DEBUG nova.compute.manager [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.656 186792 DEBUG oslo_concurrency.lockutils [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.656 186792 DEBUG oslo_concurrency.lockutils [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.657 186792 DEBUG oslo_concurrency.lockutils [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.657 186792 DEBUG nova.compute.manager [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.657 186792 WARNING nova.compute.manager [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:20:48 np0005531888 nova_compute[186788]: 2025-11-22 08:20:48.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:49 np0005531888 nova_compute[186788]: 2025-11-22 08:20:49.090 186792 DEBUG nova.network.neutron [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updated VIF entry in instance network info cache for port 2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:20:49 np0005531888 nova_compute[186788]: 2025-11-22 08:20:49.091 186792 DEBUG nova.network.neutron [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Updating instance_info_cache with network_info: [{"id": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "address": "fa:16:3e:1c:20:c3", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1c:20c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b3fde18-35", "ovs_interfaceid": "2b3fde18-35ea-46f7-9e36-94ea2e5b8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:49 np0005531888 nova_compute[186788]: 2025-11-22 08:20:49.115 186792 DEBUG oslo_concurrency.lockutils [req-df98ba04-b61a-42ec-bc8f-80a65e522446 req-9b08000a-a06a-4a6e-b990-a3b8dc7b33f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-623aa62d-3837-4517-b0c9-c705520c154b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:50 np0005531888 nova_compute[186788]: 2025-11-22 08:20:50.030 186792 DEBUG nova.network.neutron [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Nov 22 03:20:50 np0005531888 nova_compute[186788]: 2025-11-22 08:20:50.031 186792 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:50 np0005531888 nova_compute[186788]: 2025-11-22 08:20:50.032 186792 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:50 np0005531888 nova_compute[186788]: 2025-11-22 08:20:50.032 186792 DEBUG nova.network.neutron [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.636 186792 DEBUG nova.network.neutron [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.658 186792 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.673 186792 DEBUG nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Creating tmpfile /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/tmppldahftp to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Nov 22 03:20:51 np0005531888 kernel: tapeda1ac92-e1 (unregistering): left promiscuous mode
Nov 22 03:20:51 np0005531888 NetworkManager[55166]: <info>  [1763799651.7038] device (tapeda1ac92-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:51Z|00600|binding|INFO|Releasing lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 from this chassis (sb_readonly=0)
Nov 22 03:20:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:51Z|00601|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 down in Southbound
Nov 22 03:20:51 np0005531888 ovn_controller[95067]: 2025-11-22T08:20:51Z|00602|binding|INFO|Removing iface tapeda1ac92-e1 ovn-installed in OVS
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.721 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:51.730 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:0e:98 10.100.0.5'], port_security=['fa:16:3e:b3:0e:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a792325d-0f21-49cc-9e79-20df696791a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46980f3d-5c4c-4eaa-bdbc-e9dfef13e740, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=eda1ac92-e156-463f-9f90-8fdd14f55dc0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:51.732 104023 INFO neutron.agent.ovn.metadata.agent [-] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 in datapath 52d2fbd4-6713-49c3-93b1-794bccb91cb5 unbound from our chassis#033[00m
Nov 22 03:20:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:51.734 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52d2fbd4-6713-49c3-93b1-794bccb91cb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.736 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 03:20:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:51.736 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[41c3f0bf-7331-4d67-83e5-27bf8040482f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:51 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:51.737 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 namespace which is not needed anymore#033[00m
Nov 22 03:20:51 np0005531888 systemd[239764]: Activating special unit Exit the Session...
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped target Main User Target.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped target Basic System.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped target Paths.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped target Sockets.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped target Timers.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 03:20:51 np0005531888 systemd[239764]: Closed D-Bus User Message Bus Socket.
Nov 22 03:20:51 np0005531888 systemd[239764]: Stopped Create User's Volatile Files and Directories.
Nov 22 03:20:51 np0005531888 systemd[239764]: Removed slice User Application Slice.
Nov 22 03:20:51 np0005531888 systemd[239764]: Reached target Shutdown.
Nov 22 03:20:51 np0005531888 systemd[239764]: Finished Exit the Session.
Nov 22 03:20:51 np0005531888 systemd[239764]: Reached target Exit the Session.
Nov 22 03:20:51 np0005531888 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000098.scope: Deactivated successfully.
Nov 22 03:20:51 np0005531888 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000098.scope: Consumed 5.447s CPU time.
Nov 22 03:20:51 np0005531888 systemd-machined[153106]: Machine qemu-74-instance-00000098 terminated.
Nov 22 03:20:51 np0005531888 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 03:20:51 np0005531888 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 03:20:51 np0005531888 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 03:20:51 np0005531888 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 03:20:51 np0005531888 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 03:20:51 np0005531888 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 03:20:51 np0005531888 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 03:20:51 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [NOTICE]   (240120) : haproxy version is 2.8.14-c23fe91
Nov 22 03:20:51 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [NOTICE]   (240120) : path to executable is /usr/sbin/haproxy
Nov 22 03:20:51 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [WARNING]  (240120) : Exiting Master process...
Nov 22 03:20:51 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [ALERT]    (240120) : Current worker (240122) exited with code 143 (Terminated)
Nov 22 03:20:51 np0005531888 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[240116]: [WARNING]  (240120) : All workers exited. Exiting... (0)
Nov 22 03:20:51 np0005531888 systemd[1]: libpod-4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf.scope: Deactivated successfully.
Nov 22 03:20:51 np0005531888 podman[240156]: 2025-11-22 08:20:51.868066676 +0000 UTC m=+0.049647933 container died 4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.899 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.905 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.931 186792 INFO nova.virt.libvirt.driver [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance destroyed successfully.#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.931 186792 DEBUG nova.objects.instance [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.947 186792 DEBUG nova.virt.libvirt.vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:20:46Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.948 186792 DEBUG nova.network.os_vif_util [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.949 186792 DEBUG nova.network.os_vif_util [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.950 186792 DEBUG os_vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.953 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.953 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeda1ac92-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.956 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.962 186792 DEBUG nova.compute.manager [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.962 186792 DEBUG oslo_concurrency.lockutils [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.963 186792 DEBUG oslo_concurrency.lockutils [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.963 186792 DEBUG oslo_concurrency.lockutils [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.963 186792 DEBUG nova.compute.manager [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.963 186792 WARNING nova.compute.manager [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.964 186792 INFO os_vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1')#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.964 186792 INFO nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Deleting instance files /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_del#033[00m
Nov 22 03:20:51 np0005531888 nova_compute[186788]: 2025-11-22 08:20:51.970 186792 INFO nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Deletion of /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_del complete#033[00m
Nov 22 03:20:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf-userdata-shm.mount: Deactivated successfully.
Nov 22 03:20:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay-af929e77e4e790d76f676f7d1f75d6f5e06be222f9dce6ad97e9459a19461738-merged.mount: Deactivated successfully.
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.044 186792 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.044 186792 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.056 186792 DEBUG nova.objects.instance [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:52 np0005531888 podman[240156]: 2025-11-22 08:20:52.093433758 +0000 UTC m=+0.275015015 container cleanup 4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:20:52 np0005531888 systemd[1]: libpod-conmon-4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf.scope: Deactivated successfully.
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.106 186792 DEBUG nova.compute.provider_tree [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.117 186792 DEBUG nova.scheduler.client.report [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.170 186792 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:52 np0005531888 podman[240203]: 2025-11-22 08:20:52.220661447 +0000 UTC m=+0.105820074 container remove 4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.226 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[509fa6ba-fa07-48c2-ba9c-6604b8e40614]: (4, ('Sat Nov 22 08:20:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 (4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf)\n4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf\nSat Nov 22 08:20:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 (4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf)\n4f1828316a353cf79c6c663b3565b74cb8e33377454728e896ce3a5204af51cf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.228 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3a18e3-6589-4c3d-93ea-aa9f61df0413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.229 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d2fbd4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.230 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:52 np0005531888 kernel: tap52d2fbd4-60: left promiscuous mode
Nov 22 03:20:52 np0005531888 nova_compute[186788]: 2025-11-22 08:20:52.255 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.260 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1b6ae0-ff92-4ad6-8532-6e52436b62ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.273 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d357c736-ff23-4cc9-84a2-c597b0512679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.274 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[25fc33b5-68cc-4c36-b151-1c296a561eb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.291 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[24196379-4fb7-4c9a-8dbf-fc4223455e01]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638750, 'reachable_time': 28650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240218, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.294 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:20:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:52.294 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[53163c4d-4c4a-4558-961b-2e599f62b71f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:52 np0005531888 systemd[1]: run-netns-ovnmeta\x2d52d2fbd4\x2d6713\x2d49c3\x2d93b1\x2d794bccb91cb5.mount: Deactivated successfully.
Nov 22 03:20:53 np0005531888 nova_compute[186788]: 2025-11-22 08:20:53.543 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:53.543 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:20:53.544 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:20:53 np0005531888 nova_compute[186788]: 2025-11-22 08:20:53.984 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.069 186792 DEBUG nova.compute.manager [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.070 186792 DEBUG nova.compute.manager [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.070 186792 DEBUG oslo_concurrency.lockutils [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.071 186792 DEBUG oslo_concurrency.lockutils [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.071 186792 DEBUG nova.network.neutron [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.075 186792 DEBUG nova.compute.manager [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.075 186792 DEBUG oslo_concurrency.lockutils [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.076 186792 DEBUG oslo_concurrency.lockutils [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.076 186792 DEBUG oslo_concurrency.lockutils [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.077 186792 DEBUG nova.compute.manager [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:54 np0005531888 nova_compute[186788]: 2025-11-22 08:20:54.077 186792 WARNING nova.compute.manager [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:20:55 np0005531888 nova_compute[186788]: 2025-11-22 08:20:55.640 186792 DEBUG nova.network.neutron [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:20:55 np0005531888 nova_compute[186788]: 2025-11-22 08:20:55.641 186792 DEBUG nova.network.neutron [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:55 np0005531888 nova_compute[186788]: 2025-11-22 08:20:55.657 186792 DEBUG oslo_concurrency.lockutils [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.724 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.940 186792 DEBUG nova.compute.manager [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.941 186792 DEBUG oslo_concurrency.lockutils [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.941 186792 DEBUG oslo_concurrency.lockutils [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.942 186792 DEBUG oslo_concurrency.lockutils [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.942 186792 DEBUG nova.compute.manager [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.942 186792 WARNING nova.compute.manager [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 22 03:20:56 np0005531888 nova_compute[186788]: 2025-11-22 08:20:56.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:58 np0005531888 podman[240221]: 2025-11-22 08:20:58.707303203 +0000 UTC m=+0.068797833 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:20:58 np0005531888 podman[240220]: 2025-11-22 08:20:58.715051144 +0000 UTC m=+0.086803656 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:20:58 np0005531888 nova_compute[186788]: 2025-11-22 08:20:58.986 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:00 np0005531888 nova_compute[186788]: 2025-11-22 08:21:00.511 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799645.509343, 623aa62d-3837-4517-b0c9-c705520c154b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:00 np0005531888 nova_compute[186788]: 2025-11-22 08:21:00.512 186792 INFO nova.compute.manager [-] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:21:00 np0005531888 nova_compute[186788]: 2025-11-22 08:21:00.536 186792 DEBUG nova.compute.manager [None req-99d3960b-49b2-4398-a368-d420563b74f9 - - - - - -] [instance: 623aa62d-3837-4517-b0c9-c705520c154b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:00.546 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:01 np0005531888 nova_compute[186788]: 2025-11-22 08:21:01.960 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:03 np0005531888 nova_compute[186788]: 2025-11-22 08:21:03.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:04 np0005531888 nova_compute[186788]: 2025-11-22 08:21:04.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:05 np0005531888 nova_compute[186788]: 2025-11-22 08:21:05.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:06 np0005531888 nova_compute[186788]: 2025-11-22 08:21:06.934 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799651.9296968, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:06 np0005531888 nova_compute[186788]: 2025-11-22 08:21:06.935 186792 INFO nova.compute.manager [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:21:06 np0005531888 nova_compute[186788]: 2025-11-22 08:21:06.957 186792 DEBUG nova.compute.manager [None req-ffa0efad-3d0a-488c-9062-08f9abe81e9f - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:06 np0005531888 nova_compute[186788]: 2025-11-22 08:21:06.964 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:07 np0005531888 nova_compute[186788]: 2025-11-22 08:21:07.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:07 np0005531888 nova_compute[186788]: 2025-11-22 08:21:07.957 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:21:07 np0005531888 nova_compute[186788]: 2025-11-22 08:21:07.957 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:21:07 np0005531888 nova_compute[186788]: 2025-11-22 08:21:07.971 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:21:08 np0005531888 nova_compute[186788]: 2025-11-22 08:21:08.990 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:09.194 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8::f816:3eff:febd:96a1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=397a3db2-78b9-4182-b3e5-f29d5ae58cda) old=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:09.195 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 397a3db2-78b9-4182-b3e5-f29d5ae58cda in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 updated#033[00m
Nov 22 03:21:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:09.197 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:21:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:09.198 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2108f4-dfde-49da-bec0-8b95143a4983]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:10 np0005531888 podman[240264]: 2025-11-22 08:21:10.677614294 +0000 UTC m=+0.054402999 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:21:10 np0005531888 podman[240265]: 2025-11-22 08:21:10.691727151 +0000 UTC m=+0.057366702 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:21:10 np0005531888 nova_compute[186788]: 2025-11-22 08:21:10.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:10 np0005531888 nova_compute[186788]: 2025-11-22 08:21:10.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:11 np0005531888 nova_compute[186788]: 2025-11-22 08:21:11.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:12.686 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8:0:1:f816:3eff:febd:96a1 2001:db8::f816:3eff:febd:96a1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:febd:96a1/64 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=397a3db2-78b9-4182-b3e5-f29d5ae58cda) old=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8::f816:3eff:febd:96a1'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:12.688 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 397a3db2-78b9-4182-b3e5-f29d5ae58cda in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 updated#033[00m
Nov 22 03:21:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:12.691 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:21:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:12.692 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc6acad-4e90-4b47-b39c-66be549c4da3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:13 np0005531888 nova_compute[186788]: 2025-11-22 08:21:13.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:15 np0005531888 podman[240308]: 2025-11-22 08:21:15.711615915 +0000 UTC m=+0.080078611 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, vcs-type=git)
Nov 22 03:21:15 np0005531888 nova_compute[186788]: 2025-11-22 08:21:15.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:16 np0005531888 nova_compute[186788]: 2025-11-22 08:21:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:16 np0005531888 nova_compute[186788]: 2025-11-22 08:21:16.972 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:16 np0005531888 nova_compute[186788]: 2025-11-22 08:21:16.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:16 np0005531888 nova_compute[186788]: 2025-11-22 08:21:16.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:16 np0005531888 nova_compute[186788]: 2025-11-22 08:21:16.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:16 np0005531888 nova_compute[186788]: 2025-11-22 08:21:16.978 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:21:17 np0005531888 podman[240330]: 2025-11-22 08:21:17.095623722 +0000 UTC m=+0.065710497 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:21:17 np0005531888 podman[240331]: 2025-11-22 08:21:17.114951686 +0000 UTC m=+0.080212523 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.160 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.162 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5695MB free_disk=73.26646041870117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.162 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.162 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.232 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.233 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.274 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.293 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.313 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.314 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.899 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.900 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.915 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.987 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.988 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.994 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:21:17 np0005531888 nova_compute[186788]: 2025-11-22 08:21:17.994 186792 INFO nova.compute.claims [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.101 186792 DEBUG nova.compute.provider_tree [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.115 186792 DEBUG nova.scheduler.client.report [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.136 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.137 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.189 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.190 186792 DEBUG nova.network.neutron [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.211 186792 INFO nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.237 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.371 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.372 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.373 186792 INFO nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Creating image(s)#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.373 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.374 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.375 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.392 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.467 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.469 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.470 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.487 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.511 186792 DEBUG nova.policy [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.554 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.556 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.590 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.591 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.592 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.675 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.676 186792 DEBUG nova.virt.disk.api [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.677 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.749 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.750 186792 DEBUG nova.virt.disk.api [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.750 186792 DEBUG nova.objects.instance [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid bebef998-c2ca-462e-95b1-d61e5c6f1198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.764 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.764 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Ensure instance console log exists: /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.765 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.766 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.766 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:18 np0005531888 nova_compute[186788]: 2025-11-22 08:21:18.993 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:20 np0005531888 nova_compute[186788]: 2025-11-22 08:21:20.155 186792 DEBUG nova.network.neutron [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Successfully created port: 5569421a-dc56-40c1-8ce0-e4d52b397e8e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:21:20 np0005531888 nova_compute[186788]: 2025-11-22 08:21:20.314 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.014 186792 DEBUG nova.network.neutron [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Successfully updated port: 5569421a-dc56-40c1-8ce0-e4d52b397e8e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.032 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.032 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.032 186792 DEBUG nova.network.neutron [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.223 186792 DEBUG nova.network.neutron [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.526 186792 DEBUG nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-changed-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.527 186792 DEBUG nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Refreshing instance network info cache due to event network-changed-5569421a-dc56-40c1-8ce0-e4d52b397e8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.527 186792 DEBUG oslo_concurrency.lockutils [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:21:21 np0005531888 nova_compute[186788]: 2025-11-22 08:21:21.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.786 186792 DEBUG nova.network.neutron [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updating instance_info_cache with network_info: [{"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.806 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.807 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance network_info: |[{"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.807 186792 DEBUG oslo_concurrency.lockutils [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.808 186792 DEBUG nova.network.neutron [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Refreshing network info cache for port 5569421a-dc56-40c1-8ce0-e4d52b397e8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.811 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Start _get_guest_xml network_info=[{"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.815 186792 WARNING nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.819 186792 DEBUG nova.virt.libvirt.host [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.820 186792 DEBUG nova.virt.libvirt.host [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.825 186792 DEBUG nova.virt.libvirt.host [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.826 186792 DEBUG nova.virt.libvirt.host [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.827 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.827 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.828 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.828 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.828 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.828 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.828 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.829 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.829 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.829 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.829 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.830 186792 DEBUG nova.virt.hardware [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.833 186792 DEBUG nova.virt.libvirt.vif [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1544516102',display_name='tempest-TestGettingAddress-server-1544516102',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1544516102',id=154,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNw6MJSPGNLL+kyz8EDCo5RBiInn7CToNjad/C5G7xXekytky+w7dqC4DTuvRP5lSmyynbA95/tXuxj5C1AvwD2ls8Ttqlo5U7TSLLaNP5qNVVinR5ySHcvnqVNdDgyGbg==',key_name='tempest-TestGettingAddress-227311633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ie4q754v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:18Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=bebef998-c2ca-462e-95b1-d61e5c6f1198,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.833 186792 DEBUG nova.network.os_vif_util [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.834 186792 DEBUG nova.network.os_vif_util [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.835 186792 DEBUG nova.objects.instance [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid bebef998-c2ca-462e-95b1-d61e5c6f1198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.846 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <uuid>bebef998-c2ca-462e-95b1-d61e5c6f1198</uuid>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <name>instance-0000009a</name>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestGettingAddress-server-1544516102</nova:name>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:21:22</nova:creationTime>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        <nova:port uuid="5569421a-dc56-40c1-8ce0-e4d52b397e8e">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe9a:4fda" ipVersion="6"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe9a:4fda" ipVersion="6"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <entry name="serial">bebef998-c2ca-462e-95b1-d61e5c6f1198</entry>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <entry name="uuid">bebef998-c2ca-462e-95b1-d61e5c6f1198</entry>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.config"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:9a:4f:da"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <target dev="tap5569421a-dc"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/console.log" append="off"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:21:22 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:21:22 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:21:22 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:21:22 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.847 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Preparing to wait for external event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.848 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.848 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.848 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.849 186792 DEBUG nova.virt.libvirt.vif [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1544516102',display_name='tempest-TestGettingAddress-server-1544516102',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1544516102',id=154,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNw6MJSPGNLL+kyz8EDCo5RBiInn7CToNjad/C5G7xXekytky+w7dqC4DTuvRP5lSmyynbA95/tXuxj5C1AvwD2ls8Ttqlo5U7TSLLaNP5qNVVinR5ySHcvnqVNdDgyGbg==',key_name='tempest-TestGettingAddress-227311633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ie4q754v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:18Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=bebef998-c2ca-462e-95b1-d61e5c6f1198,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.849 186792 DEBUG nova.network.os_vif_util [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.850 186792 DEBUG nova.network.os_vif_util [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.850 186792 DEBUG os_vif [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.851 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.851 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.851 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.853 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.853 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5569421a-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.854 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5569421a-dc, col_values=(('external_ids', {'iface-id': '5569421a-dc56-40c1-8ce0-e4d52b397e8e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:4f:da', 'vm-uuid': 'bebef998-c2ca-462e-95b1-d61e5c6f1198'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.855 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531888 NetworkManager[55166]: <info>  [1763799682.8562] manager: (tap5569421a-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.857 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.861 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.862 186792 INFO os_vif [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc')#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.899 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.900 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.900 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:9a:4f:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.901 186792 INFO nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Using config drive#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:22 np0005531888 nova_compute[186788]: 2025-11-22 08:21:22.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.231 186792 INFO nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Creating config drive at /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.config#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.238 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwprwsep9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.365 186792 DEBUG oslo_concurrency.processutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwprwsep9" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:23 np0005531888 kernel: tap5569421a-dc: entered promiscuous mode
Nov 22 03:21:23 np0005531888 NetworkManager[55166]: <info>  [1763799683.4432] manager: (tap5569421a-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:23Z|00603|binding|INFO|Claiming lport 5569421a-dc56-40c1-8ce0-e4d52b397e8e for this chassis.
Nov 22 03:21:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:23Z|00604|binding|INFO|5569421a-dc56-40c1-8ce0-e4d52b397e8e: Claiming fa:16:3e:9a:4f:da 10.100.0.12 2001:db8:0:1:f816:3eff:fe9a:4fda 2001:db8::f816:3eff:fe9a:4fda
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.464 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:4f:da 10.100.0.12 2001:db8:0:1:f816:3eff:fe9a:4fda 2001:db8::f816:3eff:fe9a:4fda'], port_security=['fa:16:3e:9a:4f:da 10.100.0.12 2001:db8:0:1:f816:3eff:fe9a:4fda 2001:db8::f816:3eff:fe9a:4fda'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8:0:1:f816:3eff:fe9a:4fda/64 2001:db8::f816:3eff:fe9a:4fda/64', 'neutron:device_id': 'bebef998-c2ca-462e-95b1-d61e5c6f1198', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77dc402d-bf06-4a39-8313-1435ce0160f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=5569421a-dc56-40c1-8ce0-e4d52b397e8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.466 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 5569421a-dc56-40c1-8ce0-e4d52b397e8e in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 bound to our chassis#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.467 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b35c418-bf90-4666-a674-9b7153e90ab7#033[00m
Nov 22 03:21:23 np0005531888 systemd-udevd[240407]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.477 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[70726066-e446-4cdb-b437-754f147529bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.478 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b35c418-b1 in ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.480 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b35c418-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.481 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d9717d-0c29-4ac8-9f35-35d45591ed3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.481 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff2e281-ade8-4d55-b81f-a007baa7763e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 systemd-machined[153106]: New machine qemu-75-instance-0000009a.
Nov 22 03:21:23 np0005531888 NetworkManager[55166]: <info>  [1763799683.4921] device (tap5569421a-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:21:23 np0005531888 NetworkManager[55166]: <info>  [1763799683.4927] device (tap5569421a-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.492 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[ca06af78-cbfc-4257-8eba-a89905e06398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.501 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:23Z|00605|binding|INFO|Setting lport 5569421a-dc56-40c1-8ce0-e4d52b397e8e ovn-installed in OVS
Nov 22 03:21:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:23Z|00606|binding|INFO|Setting lport 5569421a-dc56-40c1-8ce0-e4d52b397e8e up in Southbound
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.509 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 systemd[1]: Started Virtual Machine qemu-75-instance-0000009a.
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.518 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[113391e9-5e91-4fa2-bd54-944e402e7721]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.547 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d7e9f0-f391-455a-9473-e992d23c0cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 NetworkManager[55166]: <info>  [1763799683.5555] manager: (tap6b35c418-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/285)
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.554 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[59ee9534-db9f-4338-bc55-389863c26fde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.586 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4097f4ea-be13-4017-8b4f-f646eddbbfaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.588 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[52ffe0f9-41b9-4d1b-ac46-911ea935dbee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 NetworkManager[55166]: <info>  [1763799683.6079] device (tap6b35c418-b0): carrier: link connected
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.611 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bea54b3d-b659-4982-9c5a-509d6e57a40b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.628 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e50050-6b5a-4ccf-8650-0a0ec9407f85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b35c418-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:96:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642425, 'reachable_time': 25633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240441, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.643 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[26044fa0-7b05-4583-8f2c-9b1cfb7adad3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:96a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642425, 'tstamp': 642425}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240442, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.659 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[502eca25-71a8-40d0-96fb-c89f60d08144]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b35c418-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:96:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642425, 'reachable_time': 25633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240443, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.685 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7a256d87-f9ce-463c-9ad7-b426d5ebbd0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.738 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2b875bfd-0ea7-4d7f-886a-7862c0af50f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.740 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b35c418-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.740 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.741 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b35c418-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:23 np0005531888 NetworkManager[55166]: <info>  [1763799683.7440] manager: (tap6b35c418-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Nov 22 03:21:23 np0005531888 kernel: tap6b35c418-b0: entered promiscuous mode
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.743 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.746 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.747 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b35c418-b0, col_values=(('external_ids', {'iface-id': '397a3db2-78b9-4182-b3e5-f29d5ae58cda'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:23Z|00607|binding|INFO|Releasing lport 397a3db2-78b9-4182-b3e5-f29d5ae58cda from this chassis (sb_readonly=0)
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.752 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b35c418-bf90-4666-a674-9b7153e90ab7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b35c418-bf90-4666-a674-9b7153e90ab7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.753 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5658e150-7139-452c-83d1-9f83ad9eb8bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.754 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-6b35c418-bf90-4666-a674-9b7153e90ab7
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/6b35c418-bf90-4666-a674-9b7153e90ab7.pid.haproxy
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 6b35c418-bf90-4666-a674-9b7153e90ab7
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:21:23 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:23.754 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'env', 'PROCESS_TAG=haproxy-6b35c418-bf90-4666-a674-9b7153e90ab7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b35c418-bf90-4666-a674-9b7153e90ab7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.764 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.856 186792 DEBUG nova.compute.manager [req-3cd386c7-ea79-4ff6-bd0d-3ff8a6611461 req-46f298f8-fc0d-427b-8e2e-398ab1e91101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.858 186792 DEBUG oslo_concurrency.lockutils [req-3cd386c7-ea79-4ff6-bd0d-3ff8a6611461 req-46f298f8-fc0d-427b-8e2e-398ab1e91101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.859 186792 DEBUG oslo_concurrency.lockutils [req-3cd386c7-ea79-4ff6-bd0d-3ff8a6611461 req-46f298f8-fc0d-427b-8e2e-398ab1e91101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.859 186792 DEBUG oslo_concurrency.lockutils [req-3cd386c7-ea79-4ff6-bd0d-3ff8a6611461 req-46f298f8-fc0d-427b-8e2e-398ab1e91101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.859 186792 DEBUG nova.compute.manager [req-3cd386c7-ea79-4ff6-bd0d-3ff8a6611461 req-46f298f8-fc0d-427b-8e2e-398ab1e91101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Processing event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.923 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.923 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799683.9225028, bebef998-c2ca-462e-95b1-d61e5c6f1198 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.924 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] VM Started (Lifecycle Event)#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.926 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.929 186792 INFO nova.virt.libvirt.driver [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance spawned successfully.#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.930 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.951 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.954 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.954 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.955 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.955 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.956 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.956 186792 DEBUG nova.virt.libvirt.driver [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.961 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:21:23 np0005531888 nova_compute[186788]: 2025-11-22 08:21:23.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.000 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.001 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799683.9234962, bebef998-c2ca-462e-95b1-d61e5c6f1198 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.001 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.025 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.030 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799683.926679, bebef998-c2ca-462e-95b1-d61e5c6f1198 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.031 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.050 186792 INFO nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Took 5.68 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.050 186792 DEBUG nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.078 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.080 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.120 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:21:24 np0005531888 podman[240482]: 2025-11-22 08:21:24.124717389 +0000 UTC m=+0.052633876 container create e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:21:24 np0005531888 systemd[1]: Started libpod-conmon-e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974.scope.
Nov 22 03:21:24 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:21:24 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b771284b9565650aef5af89dd610990f15ab77c6e71e3a73298aa711676e0027/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:21:24 np0005531888 podman[240482]: 2025-11-22 08:21:24.097285084 +0000 UTC m=+0.025201581 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:21:24 np0005531888 podman[240482]: 2025-11-22 08:21:24.201578769 +0000 UTC m=+0.129495266 container init e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:21:24 np0005531888 podman[240482]: 2025-11-22 08:21:24.208383467 +0000 UTC m=+0.136299944 container start e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.213 186792 INFO nova.compute.manager [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Took 6.24 seconds to build instance.#033[00m
Nov 22 03:21:24 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [NOTICE]   (240501) : New worker (240503) forked
Nov 22 03:21:24 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [NOTICE]   (240501) : Loading success.
Nov 22 03:21:24 np0005531888 nova_compute[186788]: 2025-11-22 08:21:24.245 186792 DEBUG oslo_concurrency.lockutils [None req-94b00240-54f2-4cf1-a713-82bd03d0fc9d 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.580 186792 DEBUG nova.network.neutron [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updated VIF entry in instance network info cache for port 5569421a-dc56-40c1-8ce0-e4d52b397e8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.581 186792 DEBUG nova.network.neutron [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updating instance_info_cache with network_info: [{"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.594 186792 DEBUG oslo_concurrency.lockutils [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.958 186792 DEBUG nova.compute.manager [req-2b4edb70-adaa-45cb-b52f-c875257a2c2e req-d61af5ee-438c-4368-9564-a8ead88bc377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.959 186792 DEBUG oslo_concurrency.lockutils [req-2b4edb70-adaa-45cb-b52f-c875257a2c2e req-d61af5ee-438c-4368-9564-a8ead88bc377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.959 186792 DEBUG oslo_concurrency.lockutils [req-2b4edb70-adaa-45cb-b52f-c875257a2c2e req-d61af5ee-438c-4368-9564-a8ead88bc377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.960 186792 DEBUG oslo_concurrency.lockutils [req-2b4edb70-adaa-45cb-b52f-c875257a2c2e req-d61af5ee-438c-4368-9564-a8ead88bc377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.960 186792 DEBUG nova.compute.manager [req-2b4edb70-adaa-45cb-b52f-c875257a2c2e req-d61af5ee-438c-4368-9564-a8ead88bc377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] No waiting events found dispatching network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:21:25 np0005531888 nova_compute[186788]: 2025-11-22 08:21:25.961 186792 WARNING nova.compute.manager [req-2b4edb70-adaa-45cb-b52f-c875257a2c2e req-d61af5ee-438c-4368-9564-a8ead88bc377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received unexpected event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e for instance with vm_state active and task_state None.#033[00m
Nov 22 03:21:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:27Z|00608|binding|INFO|Releasing lport 397a3db2-78b9-4182-b3e5-f29d5ae58cda from this chassis (sb_readonly=0)
Nov 22 03:21:27 np0005531888 NetworkManager[55166]: <info>  [1763799687.7514] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Nov 22 03:21:27 np0005531888 nova_compute[186788]: 2025-11-22 08:21:27.751 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:27 np0005531888 NetworkManager[55166]: <info>  [1763799687.7523] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Nov 22 03:21:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:27Z|00609|binding|INFO|Releasing lport 397a3db2-78b9-4182-b3e5-f29d5ae58cda from this chassis (sb_readonly=0)
Nov 22 03:21:27 np0005531888 nova_compute[186788]: 2025-11-22 08:21:27.779 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:27 np0005531888 nova_compute[186788]: 2025-11-22 08:21:27.785 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:27 np0005531888 nova_compute[186788]: 2025-11-22 08:21:27.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:28 np0005531888 nova_compute[186788]: 2025-11-22 08:21:28.294 186792 DEBUG nova.compute.manager [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-changed-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:28 np0005531888 nova_compute[186788]: 2025-11-22 08:21:28.295 186792 DEBUG nova.compute.manager [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Refreshing instance network info cache due to event network-changed-5569421a-dc56-40c1-8ce0-e4d52b397e8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:21:28 np0005531888 nova_compute[186788]: 2025-11-22 08:21:28.295 186792 DEBUG oslo_concurrency.lockutils [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:21:28 np0005531888 nova_compute[186788]: 2025-11-22 08:21:28.295 186792 DEBUG oslo_concurrency.lockutils [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:21:28 np0005531888 nova_compute[186788]: 2025-11-22 08:21:28.295 186792 DEBUG nova.network.neutron [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Refreshing network info cache for port 5569421a-dc56-40c1-8ce0-e4d52b397e8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:21:28 np0005531888 nova_compute[186788]: 2025-11-22 08:21:28.996 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:29 np0005531888 podman[240514]: 2025-11-22 08:21:29.688585622 +0000 UTC m=+0.061438392 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:21:29 np0005531888 podman[240513]: 2025-11-22 08:21:29.702878084 +0000 UTC m=+0.079167709 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:21:30 np0005531888 nova_compute[186788]: 2025-11-22 08:21:30.893 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:31 np0005531888 nova_compute[186788]: 2025-11-22 08:21:31.724 186792 DEBUG nova.network.neutron [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updated VIF entry in instance network info cache for port 5569421a-dc56-40c1-8ce0-e4d52b397e8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:21:31 np0005531888 nova_compute[186788]: 2025-11-22 08:21:31.725 186792 DEBUG nova.network.neutron [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updating instance_info_cache with network_info: [{"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:21:31 np0005531888 nova_compute[186788]: 2025-11-22 08:21:31.769 186792 DEBUG oslo_concurrency.lockutils [req-d5e7e8d8-10cc-43a3-9b92-926274fc4450 req-073e4907-20ce-47f4-a389-23fcb0938fc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:21:32 np0005531888 nova_compute[186788]: 2025-11-22 08:21:32.857 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:34 np0005531888 nova_compute[186788]: 2025-11-22 08:21:34.002 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:36.836 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:37Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:4f:da 10.100.0.12
Nov 22 03:21:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:21:37Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:4f:da 10.100.0.12
Nov 22 03:21:37 np0005531888 nova_compute[186788]: 2025-11-22 08:21:37.859 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:39 np0005531888 nova_compute[186788]: 2025-11-22 08:21:39.003 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:39 np0005531888 nova_compute[186788]: 2025-11-22 08:21:39.400 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:41 np0005531888 podman[240572]: 2025-11-22 08:21:41.675388956 +0000 UTC m=+0.049193371 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:21:41 np0005531888 podman[240571]: 2025-11-22 08:21:41.678602065 +0000 UTC m=+0.057191518 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:21:42 np0005531888 nova_compute[186788]: 2025-11-22 08:21:42.863 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:44 np0005531888 nova_compute[186788]: 2025-11-22 08:21:44.005 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:46 np0005531888 podman[240616]: 2025-11-22 08:21:46.681634366 +0000 UTC m=+0.060586081 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:21:47 np0005531888 podman[240637]: 2025-11-22 08:21:47.688257923 +0000 UTC m=+0.066255730 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:21:47 np0005531888 podman[240638]: 2025-11-22 08:21:47.70520031 +0000 UTC m=+0.079968358 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:21:47 np0005531888 nova_compute[186788]: 2025-11-22 08:21:47.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531888 nova_compute[186788]: 2025-11-22 08:21:49.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:52 np0005531888 nova_compute[186788]: 2025-11-22 08:21:52.865 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:54 np0005531888 nova_compute[186788]: 2025-11-22 08:21:54.010 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:57 np0005531888 nova_compute[186788]: 2025-11-22 08:21:57.867 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:58 np0005531888 nova_compute[186788]: 2025-11-22 08:21:58.535 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:58.535 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:21:58.537 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:21:59 np0005531888 nova_compute[186788]: 2025-11-22 08:21:59.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:00 np0005531888 podman[240681]: 2025-11-22 08:22:00.67462941 +0000 UTC m=+0.042374053 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 03:22:00 np0005531888 podman[240680]: 2025-11-22 08:22:00.675361818 +0000 UTC m=+0.045045368 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:22:02 np0005531888 nova_compute[186788]: 2025-11-22 08:22:02.868 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:04 np0005531888 nova_compute[186788]: 2025-11-22 08:22:04.013 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:04 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:04.540 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:05 np0005531888 nova_compute[186788]: 2025-11-22 08:22:05.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:06 np0005531888 nova_compute[186788]: 2025-11-22 08:22:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:07 np0005531888 nova_compute[186788]: 2025-11-22 08:22:07.871 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:09 np0005531888 nova_compute[186788]: 2025-11-22 08:22:09.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:09 np0005531888 nova_compute[186788]: 2025-11-22 08:22:09.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:09 np0005531888 nova_compute[186788]: 2025-11-22 08:22:09.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:22:09 np0005531888 nova_compute[186788]: 2025-11-22 08:22:09.957 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:22:11 np0005531888 nova_compute[186788]: 2025-11-22 08:22:11.391 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:22:11 np0005531888 nova_compute[186788]: 2025-11-22 08:22:11.392 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:22:11 np0005531888 nova_compute[186788]: 2025-11-22 08:22:11.392 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:22:11 np0005531888 nova_compute[186788]: 2025-11-22 08:22:11.392 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bebef998-c2ca-462e-95b1-d61e5c6f1198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:22:12 np0005531888 podman[240725]: 2025-11-22 08:22:12.689425432 +0000 UTC m=+0.065462410 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:22:12 np0005531888 podman[240726]: 2025-11-22 08:22:12.689914455 +0000 UTC m=+0.060110989 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:22:12 np0005531888 nova_compute[186788]: 2025-11-22 08:22:12.873 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:14 np0005531888 nova_compute[186788]: 2025-11-22 08:22:14.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:15 np0005531888 nova_compute[186788]: 2025-11-22 08:22:15.773 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updating instance_info_cache with network_info: [{"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:22:15 np0005531888 nova_compute[186788]: 2025-11-22 08:22:15.787 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:22:15 np0005531888 nova_compute[186788]: 2025-11-22 08:22:15.787 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:22:15 np0005531888 nova_compute[186788]: 2025-11-22 08:22:15.788 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:15 np0005531888 nova_compute[186788]: 2025-11-22 08:22:15.788 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:17 np0005531888 podman[240764]: 2025-11-22 08:22:17.686705342 +0000 UTC m=+0.058040379 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, config_id=edpm, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350)
Nov 22 03:22:17 np0005531888 podman[240786]: 2025-11-22 08:22:17.784734953 +0000 UTC m=+0.063994285 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 22 03:22:17 np0005531888 nova_compute[186788]: 2025-11-22 08:22:17.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:17 np0005531888 podman[240807]: 2025-11-22 08:22:17.900751175 +0000 UTC m=+0.086616840 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:22:17 np0005531888 nova_compute[186788]: 2025-11-22 08:22:17.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:18 np0005531888 nova_compute[186788]: 2025-11-22 08:22:18.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:18 np0005531888 nova_compute[186788]: 2025-11-22 08:22:18.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:18 np0005531888 nova_compute[186788]: 2025-11-22 08:22:18.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:18 np0005531888 nova_compute[186788]: 2025-11-22 08:22:18.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:18 np0005531888 nova_compute[186788]: 2025-11-22 08:22:18.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.049 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.112 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.113 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.169 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.334 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.336 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5518MB free_disk=73.2373275756836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.337 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.337 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.410 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance bebef998-c2ca-462e-95b1-d61e5c6f1198 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.411 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.411 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.452 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.465 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.483 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:22:19 np0005531888 nova_compute[186788]: 2025-11-22 08:22:19.483 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:20 np0005531888 nova_compute[186788]: 2025-11-22 08:22:20.483 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:22 np0005531888 nova_compute[186788]: 2025-11-22 08:22:22.878 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:24 np0005531888 nova_compute[186788]: 2025-11-22 08:22:24.022 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:24 np0005531888 nova_compute[186788]: 2025-11-22 08:22:24.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:24 np0005531888 nova_compute[186788]: 2025-11-22 08:22:24.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.841 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.842 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.842 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.842 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.843 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.853 186792 INFO nova.compute.manager [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Terminating instance#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.861 186792 DEBUG nova.compute.manager [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:22:25 np0005531888 kernel: tap5569421a-dc (unregistering): left promiscuous mode
Nov 22 03:22:25 np0005531888 NetworkManager[55166]: <info>  [1763799745.8971] device (tap5569421a-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:22:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:22:25Z|00610|binding|INFO|Releasing lport 5569421a-dc56-40c1-8ce0-e4d52b397e8e from this chassis (sb_readonly=0)
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.907 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:22:25Z|00611|binding|INFO|Setting lport 5569421a-dc56-40c1-8ce0-e4d52b397e8e down in Southbound
Nov 22 03:22:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:22:25Z|00612|binding|INFO|Removing iface tap5569421a-dc ovn-installed in OVS
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.909 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:25.919 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:4f:da 10.100.0.12 2001:db8:0:1:f816:3eff:fe9a:4fda 2001:db8::f816:3eff:fe9a:4fda'], port_security=['fa:16:3e:9a:4f:da 10.100.0.12 2001:db8:0:1:f816:3eff:fe9a:4fda 2001:db8::f816:3eff:fe9a:4fda'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8:0:1:f816:3eff:fe9a:4fda/64 2001:db8::f816:3eff:fe9a:4fda/64', 'neutron:device_id': 'bebef998-c2ca-462e-95b1-d61e5c6f1198', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '77dc402d-bf06-4a39-8313-1435ce0160f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=5569421a-dc56-40c1-8ce0-e4d52b397e8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:22:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:25.920 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 5569421a-dc56-40c1-8ce0-e4d52b397e8e in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 unbound from our chassis#033[00m
Nov 22 03:22:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:25.921 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:22:25 np0005531888 nova_compute[186788]: 2025-11-22 08:22:25.923 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:25.925 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f6c996-2534-4dab-b72b-64873afbdda1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:25.926 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 namespace which is not needed anymore#033[00m
Nov 22 03:22:25 np0005531888 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Nov 22 03:22:25 np0005531888 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009a.scope: Consumed 15.804s CPU time.
Nov 22 03:22:25 np0005531888 systemd-machined[153106]: Machine qemu-75-instance-0000009a terminated.
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.131 186792 DEBUG nova.compute.manager [req-34c7db94-42d6-49c0-97ee-95d2b53bda84 req-78e6684e-c9f4-4e4b-ba9a-1bd43e87ba52 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-vif-unplugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.132 186792 DEBUG oslo_concurrency.lockutils [req-34c7db94-42d6-49c0-97ee-95d2b53bda84 req-78e6684e-c9f4-4e4b-ba9a-1bd43e87ba52 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.132 186792 DEBUG oslo_concurrency.lockutils [req-34c7db94-42d6-49c0-97ee-95d2b53bda84 req-78e6684e-c9f4-4e4b-ba9a-1bd43e87ba52 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.133 186792 DEBUG oslo_concurrency.lockutils [req-34c7db94-42d6-49c0-97ee-95d2b53bda84 req-78e6684e-c9f4-4e4b-ba9a-1bd43e87ba52 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.133 186792 DEBUG nova.compute.manager [req-34c7db94-42d6-49c0-97ee-95d2b53bda84 req-78e6684e-c9f4-4e4b-ba9a-1bd43e87ba52 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] No waiting events found dispatching network-vif-unplugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.133 186792 DEBUG nova.compute.manager [req-34c7db94-42d6-49c0-97ee-95d2b53bda84 req-78e6684e-c9f4-4e4b-ba9a-1bd43e87ba52 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-vif-unplugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.135 186792 INFO nova.virt.libvirt.driver [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance destroyed successfully.#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.136 186792 DEBUG nova.objects.instance [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid bebef998-c2ca-462e-95b1-d61e5c6f1198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.146 186792 DEBUG nova.virt.libvirt.vif [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1544516102',display_name='tempest-TestGettingAddress-server-1544516102',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1544516102',id=154,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNw6MJSPGNLL+kyz8EDCo5RBiInn7CToNjad/C5G7xXekytky+w7dqC4DTuvRP5lSmyynbA95/tXuxj5C1AvwD2ls8Ttqlo5U7TSLLaNP5qNVVinR5ySHcvnqVNdDgyGbg==',key_name='tempest-TestGettingAddress-227311633',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:21:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ie4q754v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:21:24Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=bebef998-c2ca-462e-95b1-d61e5c6f1198,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.147 186792 DEBUG nova.network.os_vif_util [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "address": "fa:16:3e:9a:4f:da", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9a:4fda", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5569421a-dc", "ovs_interfaceid": "5569421a-dc56-40c1-8ce0-e4d52b397e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.148 186792 DEBUG nova.network.os_vif_util [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.148 186792 DEBUG os_vif [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.150 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.150 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569421a-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.152 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.154 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.156 186792 INFO os_vif [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:4f:da,bridge_name='br-int',has_traffic_filtering=True,id=5569421a-dc56-40c1-8ce0-e4d52b397e8e,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5569421a-dc')#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.157 186792 INFO nova.virt.libvirt.driver [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Deleting instance files /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198_del#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.157 186792 INFO nova.virt.libvirt.driver [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Deletion of /var/lib/nova/instances/bebef998-c2ca-462e-95b1-d61e5c6f1198_del complete#033[00m
Nov 22 03:22:26 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [NOTICE]   (240501) : haproxy version is 2.8.14-c23fe91
Nov 22 03:22:26 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [NOTICE]   (240501) : path to executable is /usr/sbin/haproxy
Nov 22 03:22:26 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [WARNING]  (240501) : Exiting Master process...
Nov 22 03:22:26 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [WARNING]  (240501) : Exiting Master process...
Nov 22 03:22:26 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [ALERT]    (240501) : Current worker (240503) exited with code 143 (Terminated)
Nov 22 03:22:26 np0005531888 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[240497]: [WARNING]  (240501) : All workers exited. Exiting... (0)
Nov 22 03:22:26 np0005531888 systemd[1]: libpod-e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974.scope: Deactivated successfully.
Nov 22 03:22:26 np0005531888 podman[240868]: 2025-11-22 08:22:26.181254999 +0000 UTC m=+0.169233492 container died e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.254 186792 INFO nova.compute.manager [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.255 186792 DEBUG oslo.service.loopingcall [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.255 186792 DEBUG nova.compute.manager [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:22:26 np0005531888 nova_compute[186788]: 2025-11-22 08:22:26.255 186792 DEBUG nova.network.neutron [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:22:26 np0005531888 systemd[1]: var-lib-containers-storage-overlay-b771284b9565650aef5af89dd610990f15ab77c6e71e3a73298aa711676e0027-merged.mount: Deactivated successfully.
Nov 22 03:22:26 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974-userdata-shm.mount: Deactivated successfully.
Nov 22 03:22:26 np0005531888 podman[240868]: 2025-11-22 08:22:26.780714072 +0000 UTC m=+0.768692565 container cleanup e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.367 186792 DEBUG nova.network.neutron [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.382 186792 INFO nova.compute.manager [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Took 1.13 seconds to deallocate network for instance.#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.437 186792 DEBUG nova.compute.manager [req-ddaa5e0b-12f1-4fcc-a111-ae94f677529b req-763273e6-a4d0-4e41-9d62-b2a014d8f7be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-vif-deleted-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.439 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.439 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.486 186792 DEBUG nova.compute.provider_tree [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.497 186792 DEBUG nova.scheduler.client.report [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.514 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.553 186792 INFO nova.scheduler.client.report [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance bebef998-c2ca-462e-95b1-d61e5c6f1198#033[00m
Nov 22 03:22:27 np0005531888 podman[240911]: 2025-11-22 08:22:27.565456271 +0000 UTC m=+0.762781580 container remove e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.570 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9c573943-3d20-42cf-9767-26a064a1ef36]: (4, ('Sat Nov 22 08:22:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 (e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974)\ne6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974\nSat Nov 22 08:22:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 (e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974)\ne6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.571 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[441340ed-0462-44f7-9e44-78bc6da77d88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.572 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b35c418-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:27 np0005531888 kernel: tap6b35c418-b0: left promiscuous mode
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.586 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.589 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e49c07be-02cd-49e4-9949-d2cc7d53f5f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 systemd[1]: libpod-conmon-e6b20c2db3c27e249a4f1b75fb4d94797937925e0c7988c73a563bb5bc39e974.scope: Deactivated successfully.
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.603 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[98caac2a-5d70-4cfd-9927-18a7648603e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.604 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1afc28bb-845c-4b9b-8a02-58ef996de31d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.618 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[17efd4fd-c87c-479e-9158-4d46cd5da270]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642418, 'reachable_time': 17041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240927, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.621 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:22:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:27.621 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec47e94-3aa7-45db-9622-02597f0832da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:22:27 np0005531888 systemd[1]: run-netns-ovnmeta\x2d6b35c418\x2dbf90\x2d4666\x2da674\x2d9b7153e90ab7.mount: Deactivated successfully.
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.626 186792 DEBUG oslo_concurrency.lockutils [None req-9c6721f4-4622-4c30-8495-0be903c43475 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.826 186792 DEBUG nova.compute.manager [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-changed-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.826 186792 DEBUG nova.compute.manager [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Refreshing instance network info cache due to event network-changed-5569421a-dc56-40c1-8ce0-e4d52b397e8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.826 186792 DEBUG oslo_concurrency.lockutils [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.827 186792 DEBUG oslo_concurrency.lockutils [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:22:27 np0005531888 nova_compute[186788]: 2025-11-22 08:22:27.827 186792 DEBUG nova.network.neutron [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Refreshing network info cache for port 5569421a-dc56-40c1-8ce0-e4d52b397e8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.049 186792 DEBUG nova.network.neutron [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.233 186792 DEBUG nova.compute.manager [req-f010aa0d-c8ee-41da-8993-541d93089437 req-413e5aeb-2189-436b-b2fd-32d7fcf03370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.233 186792 DEBUG oslo_concurrency.lockutils [req-f010aa0d-c8ee-41da-8993-541d93089437 req-413e5aeb-2189-436b-b2fd-32d7fcf03370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.234 186792 DEBUG oslo_concurrency.lockutils [req-f010aa0d-c8ee-41da-8993-541d93089437 req-413e5aeb-2189-436b-b2fd-32d7fcf03370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.234 186792 DEBUG oslo_concurrency.lockutils [req-f010aa0d-c8ee-41da-8993-541d93089437 req-413e5aeb-2189-436b-b2fd-32d7fcf03370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bebef998-c2ca-462e-95b1-d61e5c6f1198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.234 186792 DEBUG nova.compute.manager [req-f010aa0d-c8ee-41da-8993-541d93089437 req-413e5aeb-2189-436b-b2fd-32d7fcf03370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] No waiting events found dispatching network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.234 186792 WARNING nova.compute.manager [req-f010aa0d-c8ee-41da-8993-541d93089437 req-413e5aeb-2189-436b-b2fd-32d7fcf03370 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Received unexpected event network-vif-plugged-5569421a-dc56-40c1-8ce0-e4d52b397e8e for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.372 186792 DEBUG nova.network.neutron [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 22 03:22:28 np0005531888 nova_compute[186788]: 2025-11-22 08:22:28.372 186792 DEBUG oslo_concurrency.lockutils [req-6025ac54-0637-47c6-b9bd-1779ba89f460 req-2390a6c1-f92f-4ec5-8cac-52d9e8290c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-bebef998-c2ca-462e-95b1-d61e5c6f1198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:22:29 np0005531888 nova_compute[186788]: 2025-11-22 08:22:29.025 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:31 np0005531888 nova_compute[186788]: 2025-11-22 08:22:31.153 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:31 np0005531888 podman[240928]: 2025-11-22 08:22:31.688716195 +0000 UTC m=+0.060723614 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:22:31 np0005531888 podman[240929]: 2025-11-22 08:22:31.694749203 +0000 UTC m=+0.060316594 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:22:34 np0005531888 nova_compute[186788]: 2025-11-22 08:22:34.025 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:36 np0005531888 nova_compute[186788]: 2025-11-22 08:22:36.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:36.148 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:22:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:36.151 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:22:36 np0005531888 nova_compute[186788]: 2025-11-22 08:22:36.155 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:36.836 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:36.837 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.847 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:22:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531888 nova_compute[186788]: 2025-11-22 08:22:36.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:39 np0005531888 nova_compute[186788]: 2025-11-22 08:22:39.028 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:40 np0005531888 nova_compute[186788]: 2025-11-22 08:22:40.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:40 np0005531888 nova_compute[186788]: 2025-11-22 08:22:40.756 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:41 np0005531888 nova_compute[186788]: 2025-11-22 08:22:41.129 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799746.128035, bebef998-c2ca-462e-95b1-d61e5c6f1198 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:22:41 np0005531888 nova_compute[186788]: 2025-11-22 08:22:41.130 186792 INFO nova.compute.manager [-] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:22:41 np0005531888 nova_compute[186788]: 2025-11-22 08:22:41.147 186792 DEBUG nova.compute.manager [None req-3f7ec07b-a58b-4b4f-8248-65689e0fba64 - - - - - -] [instance: bebef998-c2ca-462e-95b1-d61e5c6f1198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:22:41 np0005531888 nova_compute[186788]: 2025-11-22 08:22:41.156 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:22:43.153 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:43 np0005531888 podman[240973]: 2025-11-22 08:22:43.701881388 +0000 UTC m=+0.060524489 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:22:43 np0005531888 podman[240972]: 2025-11-22 08:22:43.714069698 +0000 UTC m=+0.078670556 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 22 03:22:44 np0005531888 nova_compute[186788]: 2025-11-22 08:22:44.029 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:46 np0005531888 nova_compute[186788]: 2025-11-22 08:22:46.159 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:48 np0005531888 podman[241017]: 2025-11-22 08:22:48.709952541 +0000 UTC m=+0.077247380 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:22:48 np0005531888 podman[241016]: 2025-11-22 08:22:48.721675459 +0000 UTC m=+0.093938211 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 03:22:48 np0005531888 podman[241018]: 2025-11-22 08:22:48.761708074 +0000 UTC m=+0.128307417 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:22:49 np0005531888 nova_compute[186788]: 2025-11-22 08:22:49.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:51 np0005531888 nova_compute[186788]: 2025-11-22 08:22:51.162 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:54 np0005531888 nova_compute[186788]: 2025-11-22 08:22:54.032 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:56 np0005531888 nova_compute[186788]: 2025-11-22 08:22:56.166 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:59 np0005531888 nova_compute[186788]: 2025-11-22 08:22:59.033 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:01 np0005531888 nova_compute[186788]: 2025-11-22 08:23:01.168 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:02 np0005531888 podman[241083]: 2025-11-22 08:23:02.679928938 +0000 UTC m=+0.053835934 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:23:02 np0005531888 podman[241084]: 2025-11-22 08:23:02.705503648 +0000 UTC m=+0.078494171 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 03:23:04 np0005531888 nova_compute[186788]: 2025-11-22 08:23:04.034 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:06 np0005531888 nova_compute[186788]: 2025-11-22 08:23:06.172 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:06 np0005531888 nova_compute[186788]: 2025-11-22 08:23:06.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:06 np0005531888 nova_compute[186788]: 2025-11-22 08:23:06.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:09 np0005531888 nova_compute[186788]: 2025-11-22 08:23:09.036 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:10 np0005531888 nova_compute[186788]: 2025-11-22 08:23:10.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:10 np0005531888 nova_compute[186788]: 2025-11-22 08:23:10.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:23:10 np0005531888 nova_compute[186788]: 2025-11-22 08:23:10.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:23:10 np0005531888 nova_compute[186788]: 2025-11-22 08:23:10.968 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:23:11 np0005531888 nova_compute[186788]: 2025-11-22 08:23:11.175 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:12 np0005531888 nova_compute[186788]: 2025-11-22 08:23:12.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:12 np0005531888 nova_compute[186788]: 2025-11-22 08:23:12.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:14 np0005531888 nova_compute[186788]: 2025-11-22 08:23:14.038 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:14 np0005531888 podman[241127]: 2025-11-22 08:23:14.675190092 +0000 UTC m=+0.045589012 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:23:14 np0005531888 podman[241126]: 2025-11-22 08:23:14.682983793 +0000 UTC m=+0.057406242 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 03:23:16 np0005531888 nova_compute[186788]: 2025-11-22 08:23:16.178 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:16.246 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:f1:bb 10.100.0.2 2001:db8::f816:3eff:fe10:f1bb'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe10:f1bb/64', 'neutron:device_id': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e1bc69f6-ec55-4040-be0d-44f334cbe3a6) old=Port_Binding(mac=['fa:16:3e:10:f1:bb 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:23:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:16.247 104023 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e1bc69f6-ec55-4040-be0d-44f334cbe3a6 in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd updated#033[00m
Nov 22 03:23:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:16.248 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 326c0814-77d4-416b-a5a1-28be00b61ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:23:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:16.250 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cc86d6c3-6a9c-4f08-a74f-851b443bc08a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:18 np0005531888 nova_compute[186788]: 2025-11-22 08:23:18.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:18 np0005531888 nova_compute[186788]: 2025-11-22 08:23:18.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:18 np0005531888 nova_compute[186788]: 2025-11-22 08:23:18.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:18 np0005531888 nova_compute[186788]: 2025-11-22 08:23:18.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:18 np0005531888 nova_compute[186788]: 2025-11-22 08:23:18.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.039 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:19 np0005531888 podman[241169]: 2025-11-22 08:23:19.099070559 +0000 UTC m=+0.071251483 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public)
Nov 22 03:23:19 np0005531888 podman[241170]: 2025-11-22 08:23:19.09911204 +0000 UTC m=+0.070603547 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:23:19 np0005531888 podman[241171]: 2025-11-22 08:23:19.139260158 +0000 UTC m=+0.101229370 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller)
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.210 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.212 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5711MB free_disk=73.26655197143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.212 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.212 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.275 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.296 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.308 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.337 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:23:19 np0005531888 nova_compute[186788]: 2025-11-22 08:23:19.337 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:20 np0005531888 nova_compute[186788]: 2025-11-22 08:23:20.338 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:20 np0005531888 nova_compute[186788]: 2025-11-22 08:23:20.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.182 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.183 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.183 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.202 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.296 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.296 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.304 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.304 186792 INFO nova.compute.claims [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.436 186792 DEBUG nova.compute.provider_tree [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.449 186792 DEBUG nova.scheduler.client.report [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.492 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.494 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.569 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.569 186792 DEBUG nova.network.neutron [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.613 186792 INFO nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.638 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.825 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.827 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.827 186792 INFO nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Creating image(s)
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.828 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.828 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.829 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.841 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.906 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.907 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.908 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.923 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.979 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:23:21 np0005531888 nova_compute[186788]: 2025-11-22 08:23:21.980 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.079 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk 1073741824" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.080 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.081 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.135 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.137 186792 DEBUG nova.virt.disk.api [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.137 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.157 186792 DEBUG nova.policy [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.201 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.202 186792 DEBUG nova.virt.disk.api [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.202 186792 DEBUG nova.objects.instance [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid bdc89be5-8a10-4ee8-85d6-a7243fa7d970 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.218 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.219 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Ensure instance console log exists: /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.219 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.220 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:23:22 np0005531888 nova_compute[186788]: 2025-11-22 08:23:22.220 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:23:24 np0005531888 nova_compute[186788]: 2025-11-22 08:23:24.043 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:23:24 np0005531888 nova_compute[186788]: 2025-11-22 08:23:24.283 186792 DEBUG nova.network.neutron [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Successfully created port: 239d172c-f1e8-411e-86cb-a1f9d9a6809f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 03:23:24 np0005531888 nova_compute[186788]: 2025-11-22 08:23:24.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:23:24 np0005531888 nova_compute[186788]: 2025-11-22 08:23:24.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 03:23:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:26.123 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.124 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:23:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:26.124 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.183 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.440 186792 DEBUG nova.network.neutron [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Successfully updated port: 239d172c-f1e8-411e-86cb-a1f9d9a6809f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.498 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.498 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.498 186792 DEBUG nova.network.neutron [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.566 186792 DEBUG nova.compute.manager [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-changed-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.567 186792 DEBUG nova.compute.manager [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Refreshing instance network info cache due to event network-changed-239d172c-f1e8-411e-86cb-a1f9d9a6809f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.567 186792 DEBUG oslo_concurrency.lockutils [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:23:26 np0005531888 nova_compute[186788]: 2025-11-22 08:23:26.740 186792 DEBUG nova.network.neutron [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.170 186792 DEBUG nova.network.neutron [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updating instance_info_cache with network_info: [{"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.192 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.192 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Instance network_info: |[{"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.193 186792 DEBUG oslo_concurrency.lockutils [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.193 186792 DEBUG nova.network.neutron [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Refreshing network info cache for port 239d172c-f1e8-411e-86cb-a1f9d9a6809f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.199 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Start _get_guest_xml network_info=[{"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.204 186792 WARNING nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.212 186792 DEBUG nova.virt.libvirt.host [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.213 186792 DEBUG nova.virt.libvirt.host [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.216 186792 DEBUG nova.virt.libvirt.host [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.218 186792 DEBUG nova.virt.libvirt.host [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.219 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.219 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.220 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.220 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.220 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.221 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.221 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.221 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.221 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.222 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.222 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.222 186792 DEBUG nova.virt.hardware [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.227 186792 DEBUG nova.virt.libvirt.vif [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:23:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-80098772',display_name='tempest-TestNetworkAdvancedServerOps-server-80098772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-80098772',id=157,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGLhyUJ+KEZ71YtAJ371P9F3aIOxDv82GXOM2rDSaovbuwE7ZludLQPbMWKyRsNCmds/O3FEcOZmrL6F7mlqNkfmkAfib4ejLY8sYaalCrugYwrch5fmUHjD7KlYtnKPQ==',key_name='tempest-TestNetworkAdvancedServerOps-1378316896',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-gxtkp165',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:23:21Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=bdc89be5-8a10-4ee8-85d6-a7243fa7d970,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.228 186792 DEBUG nova.network.os_vif_util [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.229 186792 DEBUG nova.network.os_vif_util [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.230 186792 DEBUG nova.objects.instance [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid bdc89be5-8a10-4ee8-85d6-a7243fa7d970 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.253 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <uuid>bdc89be5-8a10-4ee8-85d6-a7243fa7d970</uuid>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <name>instance-0000009d</name>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-80098772</nova:name>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:23:28</nova:creationTime>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        <nova:port uuid="239d172c-f1e8-411e-86cb-a1f9d9a6809f">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <entry name="serial">bdc89be5-8a10-4ee8-85d6-a7243fa7d970</entry>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <entry name="uuid">bdc89be5-8a10-4ee8-85d6-a7243fa7d970</entry>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.config"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:5a:b4:ce"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <target dev="tap239d172c-f1"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/console.log" append="off"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:23:28 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:23:28 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:23:28 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:23:28 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.255 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Preparing to wait for external event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.256 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.256 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.256 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.257 186792 DEBUG nova.virt.libvirt.vif [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:23:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-80098772',display_name='tempest-TestNetworkAdvancedServerOps-server-80098772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-80098772',id=157,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGLhyUJ+KEZ71YtAJ371P9F3aIOxDv82GXOM2rDSaovbuwE7ZludLQPbMWKyRsNCmds/O3FEcOZmrL6F7mlqNkfmkAfib4ejLY8sYaalCrugYwrch5fmUHjD7KlYtnKPQ==',key_name='tempest-TestNetworkAdvancedServerOps-1378316896',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-gxtkp165',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:23:21Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=bdc89be5-8a10-4ee8-85d6-a7243fa7d970,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.258 186792 DEBUG nova.network.os_vif_util [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.259 186792 DEBUG nova.network.os_vif_util [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.259 186792 DEBUG os_vif [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.260 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.261 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.262 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.264 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.265 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap239d172c-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.265 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap239d172c-f1, col_values=(('external_ids', {'iface-id': '239d172c-f1e8-411e-86cb-a1f9d9a6809f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:b4:ce', 'vm-uuid': 'bdc89be5-8a10-4ee8-85d6-a7243fa7d970'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:28 np0005531888 NetworkManager[55166]: <info>  [1763799808.2677] manager: (tap239d172c-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.269 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.277 186792 INFO os_vif [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1')#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.948 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.949 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.949 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:5a:b4:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:23:28 np0005531888 nova_compute[186788]: 2025-11-22 08:23:28.949 186792 INFO nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Using config drive#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.046 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.452 186792 INFO nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Creating config drive at /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.config#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.460 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpboc_53g9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.587 186792 DEBUG oslo_concurrency.processutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpboc_53g9" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:23:29 np0005531888 kernel: tap239d172c-f1: entered promiscuous mode
Nov 22 03:23:29 np0005531888 NetworkManager[55166]: <info>  [1763799809.6473] manager: (tap239d172c-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/290)
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:29Z|00613|binding|INFO|Claiming lport 239d172c-f1e8-411e-86cb-a1f9d9a6809f for this chassis.
Nov 22 03:23:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:29Z|00614|binding|INFO|239d172c-f1e8-411e-86cb-a1f9d9a6809f: Claiming fa:16:3e:5a:b4:ce 10.100.0.11
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.651 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:29 np0005531888 systemd-udevd[241270]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:23:29 np0005531888 NetworkManager[55166]: <info>  [1763799809.6877] device (tap239d172c-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:23:29 np0005531888 NetworkManager[55166]: <info>  [1763799809.6888] device (tap239d172c-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:23:29 np0005531888 systemd-machined[153106]: New machine qemu-76-instance-0000009d.
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.710 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:b4:ce 10.100.0.11'], port_security=['fa:16:3e:5a:b4:ce 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bdc89be5-8a10-4ee8-85d6-a7243fa7d970', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-026c2103-167f-4a9a-bb3c-1ec829f03745', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5de5bc1d-0b11-4452-a865-9eb496524359', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a6970b-133d-4a4c-93dc-892246ec381c, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=239d172c-f1e8-411e-86cb-a1f9d9a6809f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.712 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 239d172c-f1e8-411e-86cb-a1f9d9a6809f in datapath 026c2103-167f-4a9a-bb3c-1ec829f03745 bound to our chassis#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.713 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 026c2103-167f-4a9a-bb3c-1ec829f03745#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.721 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.725 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9d26700f-3d79-4c85-8ec5-e733ea358559]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.726 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap026c2103-11 in ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:23:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:29Z|00615|binding|INFO|Setting lport 239d172c-f1e8-411e-86cb-a1f9d9a6809f ovn-installed in OVS
Nov 22 03:23:29 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:29Z|00616|binding|INFO|Setting lport 239d172c-f1e8-411e-86cb-a1f9d9a6809f up in Southbound
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.727 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap026c2103-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.727 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[326b1b57-2f67-4074-89a1-65566d55891d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 systemd[1]: Started Virtual Machine qemu-76-instance-0000009d.
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.729 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a5efca9a-f14c-4534-b55b-513afca961c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.743 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[7243ee8f-9934-4a98-be85-9b67c4689df6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.768 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[52729f13-fb9d-4e9a-99e0-33c016fd13bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.779 186792 DEBUG nova.network.neutron [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updated VIF entry in instance network info cache for port 239d172c-f1e8-411e-86cb-a1f9d9a6809f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.780 186792 DEBUG nova.network.neutron [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updating instance_info_cache with network_info: [{"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.798 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[ff365fe8-c5ff-4e3d-b286-a7926aa02762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.803 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0867f405-44dd-477e-867c-7a9254ea44c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 NetworkManager[55166]: <info>  [1763799809.8049] manager: (tap026c2103-10): new Veth device (/org/freedesktop/NetworkManager/Devices/291)
Nov 22 03:23:29 np0005531888 systemd-udevd[241273]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:23:29 np0005531888 nova_compute[186788]: 2025-11-22 08:23:29.820 186792 DEBUG oslo_concurrency.lockutils [req-2d134ccb-19e0-4ead-90a4-a9c5d1a38c50 req-8c705746-3fe8-4786-a366-1ae21735b04a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.834 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4171ce-9b89-4a2b-ae95-73ea3bccc691]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.837 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[43558b78-3de7-43bd-8e70-47f27b2b0bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 NetworkManager[55166]: <info>  [1763799809.8582] device (tap026c2103-10): carrier: link connected
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.864 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[63887458-af0b-49c6-8324-bfe3bcefefa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.882 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[95ff2833-18a4-4a34-95ba-8eaafdd5d819]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap026c2103-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:dc:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655050, 'reachable_time': 43704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241304, 'error': None, 'target': 'ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.896 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[890482d5-f6e3-47b4-bcec-8e3706ede20d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:dcf4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655050, 'tstamp': 655050}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241305, 'error': None, 'target': 'ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.915 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0b647442-5369-439f-b0d6-c940febdc955]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap026c2103-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:dc:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655050, 'reachable_time': 43704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241306, 'error': None, 'target': 'ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:29 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:29.947 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[589d1c6a-1fc6-4f81-a663-85271b13df66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.008 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3e336d8a-a680-4259-ba55-d81142bbf58f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.010 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap026c2103-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.010 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.010 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap026c2103-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:30 np0005531888 NetworkManager[55166]: <info>  [1763799810.0133] manager: (tap026c2103-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Nov 22 03:23:30 np0005531888 kernel: tap026c2103-10: entered promiscuous mode
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.031 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap026c2103-10, col_values=(('external_ids', {'iface-id': '821a9d69-d92a-4153-a4e2-4aaccb8578e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:30Z|00617|binding|INFO|Releasing lport 821a9d69-d92a-4153-a4e2-4aaccb8578e7 from this chassis (sb_readonly=0)
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.032 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.034 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/026c2103-167f-4a9a-bb3c-1ec829f03745.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/026c2103-167f-4a9a-bb3c-1ec829f03745.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.035 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2cb889-7333-4249-875c-2548dbdda115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.036 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-026c2103-167f-4a9a-bb3c-1ec829f03745
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/026c2103-167f-4a9a-bb3c-1ec829f03745.pid.haproxy
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 026c2103-167f-4a9a-bb3c-1ec829f03745
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:23:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:30.037 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745', 'env', 'PROCESS_TAG=haproxy-026c2103-167f-4a9a-bb3c-1ec829f03745', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/026c2103-167f-4a9a-bb3c-1ec829f03745.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.044 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.117 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799810.116711, bdc89be5-8a10-4ee8-85d6-a7243fa7d970 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.117 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] VM Started (Lifecycle Event)#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.145 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.149 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799810.1177666, bdc89be5-8a10-4ee8-85d6-a7243fa7d970 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.149 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.167 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.172 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:23:30 np0005531888 nova_compute[186788]: 2025-11-22 08:23:30.191 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:23:30 np0005531888 podman[241345]: 2025-11-22 08:23:30.378069984 +0000 UTC m=+0.021982022 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:23:30 np0005531888 podman[241345]: 2025-11-22 08:23:30.914016295 +0000 UTC m=+0.557928303 container create 83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:23:31 np0005531888 systemd[1]: Started libpod-conmon-83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b.scope.
Nov 22 03:23:31 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:23:31 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2921dfef6153c81b288230fc3b9c7e3c394e78eefa2df86e96bf422bda0c9bb3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:23:31 np0005531888 podman[241345]: 2025-11-22 08:23:31.256415165 +0000 UTC m=+0.900327203 container init 83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:23:31 np0005531888 podman[241345]: 2025-11-22 08:23:31.262521725 +0000 UTC m=+0.906433743 container start 83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:23:31 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [NOTICE]   (241364) : New worker (241366) forked
Nov 22 03:23:31 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [NOTICE]   (241364) : Loading success.
Nov 22 03:23:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:32.127 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.652 186792 DEBUG nova.compute.manager [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.653 186792 DEBUG oslo_concurrency.lockutils [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.653 186792 DEBUG oslo_concurrency.lockutils [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.653 186792 DEBUG oslo_concurrency.lockutils [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.654 186792 DEBUG nova.compute.manager [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Processing event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.654 186792 DEBUG nova.compute.manager [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.654 186792 DEBUG oslo_concurrency.lockutils [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.654 186792 DEBUG oslo_concurrency.lockutils [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.654 186792 DEBUG oslo_concurrency.lockutils [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.655 186792 DEBUG nova.compute.manager [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] No waiting events found dispatching network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.655 186792 WARNING nova.compute.manager [req-0c3aee23-6131-4d6e-ab1e-9d4bf1da5daa req-f65abda2-7780-46cb-bb51-256c3176091f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received unexpected event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.655 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.659 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799812.6591952, bdc89be5-8a10-4ee8-85d6-a7243fa7d970 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.660 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.661 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.664 186792 INFO nova.virt.libvirt.driver [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Instance spawned successfully.#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.665 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.703 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.710 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.713 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.713 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.714 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.714 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.715 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.715 186792 DEBUG nova.virt.libvirt.driver [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.752 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.834 186792 INFO nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Took 11.01 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.835 186792 DEBUG nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.934 186792 INFO nova.compute.manager [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Took 11.66 seconds to build instance.#033[00m
Nov 22 03:23:32 np0005531888 nova_compute[186788]: 2025-11-22 08:23:32.987 186792 DEBUG oslo_concurrency.lockutils [None req-c4a0d048-6fe7-4133-8b66-ce2563733686 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:33 np0005531888 nova_compute[186788]: 2025-11-22 08:23:33.268 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:33 np0005531888 podman[241376]: 2025-11-22 08:23:33.693101651 +0000 UTC m=+0.063309288 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:23:33 np0005531888 podman[241375]: 2025-11-22 08:23:33.693382429 +0000 UTC m=+0.063960655 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:23:34 np0005531888 nova_compute[186788]: 2025-11-22 08:23:34.047 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:36.840 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:36.841 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:23:36.843 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:38 np0005531888 nova_compute[186788]: 2025-11-22 08:23:38.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:38 np0005531888 NetworkManager[55166]: <info>  [1763799818.1047] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Nov 22 03:23:38 np0005531888 NetworkManager[55166]: <info>  [1763799818.1054] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Nov 22 03:23:38 np0005531888 nova_compute[186788]: 2025-11-22 08:23:38.156 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:38 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:38Z|00618|binding|INFO|Releasing lport 821a9d69-d92a-4153-a4e2-4aaccb8578e7 from this chassis (sb_readonly=0)
Nov 22 03:23:38 np0005531888 nova_compute[186788]: 2025-11-22 08:23:38.168 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:38 np0005531888 nova_compute[186788]: 2025-11-22 08:23:38.275 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:39 np0005531888 nova_compute[186788]: 2025-11-22 08:23:39.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:39 np0005531888 nova_compute[186788]: 2025-11-22 08:23:39.279 186792 DEBUG nova.compute.manager [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-changed-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:23:39 np0005531888 nova_compute[186788]: 2025-11-22 08:23:39.280 186792 DEBUG nova.compute.manager [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Refreshing instance network info cache due to event network-changed-239d172c-f1e8-411e-86cb-a1f9d9a6809f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:23:39 np0005531888 nova_compute[186788]: 2025-11-22 08:23:39.281 186792 DEBUG oslo_concurrency.lockutils [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:23:39 np0005531888 nova_compute[186788]: 2025-11-22 08:23:39.282 186792 DEBUG oslo_concurrency.lockutils [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:23:39 np0005531888 nova_compute[186788]: 2025-11-22 08:23:39.282 186792 DEBUG nova.network.neutron [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Refreshing network info cache for port 239d172c-f1e8-411e-86cb-a1f9d9a6809f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:23:41 np0005531888 nova_compute[186788]: 2025-11-22 08:23:41.420 186792 DEBUG nova.network.neutron [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updated VIF entry in instance network info cache for port 239d172c-f1e8-411e-86cb-a1f9d9a6809f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:23:41 np0005531888 nova_compute[186788]: 2025-11-22 08:23:41.422 186792 DEBUG nova.network.neutron [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updating instance_info_cache with network_info: [{"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:23:41 np0005531888 nova_compute[186788]: 2025-11-22 08:23:41.442 186792 DEBUG oslo_concurrency.lockutils [req-4f55035f-9358-4569-899e-8e799af60b20 req-60a4626b-8dfb-41b8-abac-2a2497841122 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:23:43 np0005531888 nova_compute[186788]: 2025-11-22 08:23:43.286 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:44 np0005531888 nova_compute[186788]: 2025-11-22 08:23:44.055 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:45 np0005531888 podman[241418]: 2025-11-22 08:23:45.694948006 +0000 UTC m=+0.050081483 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:23:45 np0005531888 podman[241417]: 2025-11-22 08:23:45.727716113 +0000 UTC m=+0.087500044 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 22 03:23:48 np0005531888 nova_compute[186788]: 2025-11-22 08:23:48.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:49 np0005531888 nova_compute[186788]: 2025-11-22 08:23:49.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:49 np0005531888 podman[241480]: 2025-11-22 08:23:49.689915817 +0000 UTC m=+0.057955057 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 22 03:23:49 np0005531888 podman[241481]: 2025-11-22 08:23:49.720547919 +0000 UTC m=+0.085526624 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:23:49 np0005531888 podman[241482]: 2025-11-22 08:23:49.727225344 +0000 UTC m=+0.087555194 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:23:50 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:50Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5a:b4:ce 10.100.0.11
Nov 22 03:23:50 np0005531888 ovn_controller[95067]: 2025-11-22T08:23:50Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5a:b4:ce 10.100.0.11
Nov 22 03:23:53 np0005531888 nova_compute[186788]: 2025-11-22 08:23:53.292 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:54 np0005531888 nova_compute[186788]: 2025-11-22 08:23:54.061 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:55 np0005531888 nova_compute[186788]: 2025-11-22 08:23:55.837 186792 INFO nova.compute.manager [None req-5abef4cc-f78d-47f9-9176-a881e3a053da d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Get console output#033[00m
Nov 22 03:23:55 np0005531888 nova_compute[186788]: 2025-11-22 08:23:55.843 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.267 186792 INFO nova.compute.manager [None req-a9bdf0a8-0dcf-4584-989a-08abede1218a d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Pausing#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.268 186792 DEBUG nova.objects.instance [None req-a9bdf0a8-0dcf-4584-989a-08abede1218a d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'flavor' on Instance uuid bdc89be5-8a10-4ee8-85d6-a7243fa7d970 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.309 186792 DEBUG nova.compute.manager [None req-a9bdf0a8-0dcf-4584-989a-08abede1218a d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.311 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799836.310116, bdc89be5-8a10-4ee8-85d6-a7243fa7d970 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.311 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.341 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.344 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:23:56 np0005531888 nova_compute[186788]: 2025-11-22 08:23:56.365 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 22 03:23:58 np0005531888 nova_compute[186788]: 2025-11-22 08:23:58.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:59 np0005531888 nova_compute[186788]: 2025-11-22 08:23:59.063 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.445 186792 INFO nova.compute.manager [None req-94819cc7-6f17-4a68-a9ce-c99b40d1467b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Get console output#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.450 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.725 186792 INFO nova.compute.manager [None req-54828a54-d5e3-4383-ae5d-c4b4c70be4ab d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Unpausing#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.726 186792 DEBUG nova.objects.instance [None req-54828a54-d5e3-4383-ae5d-c4b4c70be4ab d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'flavor' on Instance uuid bdc89be5-8a10-4ee8-85d6-a7243fa7d970 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.755 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799841.754282, bdc89be5-8a10-4ee8-85d6-a7243fa7d970 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.755 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:24:01 np0005531888 virtqemud[186358]: argument unsupported: QEMU guest agent is not configured
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.760 186792 DEBUG nova.virt.libvirt.guest [None req-54828a54-d5e3-4383-ae5d-c4b4c70be4ab d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.760 186792 DEBUG nova.compute.manager [None req-54828a54-d5e3-4383-ae5d-c4b4c70be4ab d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.787 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.790 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:24:01 np0005531888 nova_compute[186788]: 2025-11-22 08:24:01.814 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Nov 22 03:24:03 np0005531888 nova_compute[186788]: 2025-11-22 08:24:03.298 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:04 np0005531888 nova_compute[186788]: 2025-11-22 08:24:04.068 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:04 np0005531888 nova_compute[186788]: 2025-11-22 08:24:04.544 186792 INFO nova.compute.manager [None req-4cee9b8c-7ca3-4d16-bf29-8957476fd0d9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Get console output#033[00m
Nov 22 03:24:04 np0005531888 nova_compute[186788]: 2025-11-22 08:24:04.551 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:24:04 np0005531888 podman[241545]: 2025-11-22 08:24:04.671396138 +0000 UTC m=+0.047738285 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:24:04 np0005531888 podman[241546]: 2025-11-22 08:24:04.676417352 +0000 UTC m=+0.051017526 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.123 186792 DEBUG nova.compute.manager [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-changed-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.124 186792 DEBUG nova.compute.manager [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Refreshing instance network info cache due to event network-changed-239d172c-f1e8-411e-86cb-a1f9d9a6809f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.124 186792 DEBUG oslo_concurrency.lockutils [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.124 186792 DEBUG oslo_concurrency.lockutils [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.125 186792 DEBUG nova.network.neutron [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Refreshing network info cache for port 239d172c-f1e8-411e-86cb-a1f9d9a6809f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.219 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.219 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.220 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.220 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.220 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.229 186792 INFO nova.compute.manager [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Terminating instance#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.241 186792 DEBUG nova.compute.manager [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:24:06 np0005531888 kernel: tap239d172c-f1 (unregistering): left promiscuous mode
Nov 22 03:24:06 np0005531888 NetworkManager[55166]: <info>  [1763799846.2716] device (tap239d172c-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:24:06 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:06Z|00619|binding|INFO|Releasing lport 239d172c-f1e8-411e-86cb-a1f9d9a6809f from this chassis (sb_readonly=0)
Nov 22 03:24:06 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:06Z|00620|binding|INFO|Setting lport 239d172c-f1e8-411e-86cb-a1f9d9a6809f down in Southbound
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.282 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:06Z|00621|binding|INFO|Removing iface tap239d172c-f1 ovn-installed in OVS
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.284 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.292 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:b4:ce 10.100.0.11'], port_security=['fa:16:3e:5a:b4:ce 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bdc89be5-8a10-4ee8-85d6-a7243fa7d970', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-026c2103-167f-4a9a-bb3c-1ec829f03745', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5de5bc1d-0b11-4452-a865-9eb496524359', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a6970b-133d-4a4c-93dc-892246ec381c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=239d172c-f1e8-411e-86cb-a1f9d9a6809f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.294 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 239d172c-f1e8-411e-86cb-a1f9d9a6809f in datapath 026c2103-167f-4a9a-bb3c-1ec829f03745 unbound from our chassis#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.296 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 026c2103-167f-4a9a-bb3c-1ec829f03745, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.298 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff27797-3762-4d16-bb50-e0c934f54b73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.299 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745 namespace which is not needed anymore#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000009d.scope: Deactivated successfully.
Nov 22 03:24:06 np0005531888 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000009d.scope: Consumed 16.485s CPU time.
Nov 22 03:24:06 np0005531888 systemd-machined[153106]: Machine qemu-76-instance-0000009d terminated.
Nov 22 03:24:06 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [NOTICE]   (241364) : haproxy version is 2.8.14-c23fe91
Nov 22 03:24:06 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [NOTICE]   (241364) : path to executable is /usr/sbin/haproxy
Nov 22 03:24:06 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [WARNING]  (241364) : Exiting Master process...
Nov 22 03:24:06 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [ALERT]    (241364) : Current worker (241366) exited with code 143 (Terminated)
Nov 22 03:24:06 np0005531888 neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745[241360]: [WARNING]  (241364) : All workers exited. Exiting... (0)
Nov 22 03:24:06 np0005531888 systemd[1]: libpod-83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b.scope: Deactivated successfully.
Nov 22 03:24:06 np0005531888 podman[241614]: 2025-11-22 08:24:06.445190321 +0000 UTC m=+0.054734417 container died 83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:24:06 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b-userdata-shm.mount: Deactivated successfully.
Nov 22 03:24:06 np0005531888 systemd[1]: var-lib-containers-storage-overlay-2921dfef6153c81b288230fc3b9c7e3c394e78eefa2df86e96bf422bda0c9bb3-merged.mount: Deactivated successfully.
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.514 186792 INFO nova.virt.libvirt.driver [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Instance destroyed successfully.#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.515 186792 DEBUG nova.objects.instance [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid bdc89be5-8a10-4ee8-85d6-a7243fa7d970 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:06 np0005531888 podman[241614]: 2025-11-22 08:24:06.527761672 +0000 UTC m=+0.137305758 container cleanup 83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:24:06 np0005531888 systemd[1]: libpod-conmon-83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b.scope: Deactivated successfully.
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.544 186792 DEBUG nova.virt.libvirt.vif [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:23:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-80098772',display_name='tempest-TestNetworkAdvancedServerOps-server-80098772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-80098772',id=157,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKGLhyUJ+KEZ71YtAJ371P9F3aIOxDv82GXOM2rDSaovbuwE7ZludLQPbMWKyRsNCmds/O3FEcOZmrL6F7mlqNkfmkAfib4ejLY8sYaalCrugYwrch5fmUHjD7KlYtnKPQ==',key_name='tempest-TestNetworkAdvancedServerOps-1378316896',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:23:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-gxtkp165',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:24:01Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=bdc89be5-8a10-4ee8-85d6-a7243fa7d970,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.544 186792 DEBUG nova.network.os_vif_util [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.545 186792 DEBUG nova.network.os_vif_util [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.546 186792 DEBUG os_vif [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.548 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap239d172c-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.550 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.551 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.554 186792 INFO os_vif [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b4:ce,bridge_name='br-int',has_traffic_filtering=True,id=239d172c-f1e8-411e-86cb-a1f9d9a6809f,network=Network(026c2103-167f-4a9a-bb3c-1ec829f03745),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap239d172c-f1')#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.555 186792 INFO nova.virt.libvirt.driver [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Deleting instance files /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970_del#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.556 186792 INFO nova.virt.libvirt.driver [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Deletion of /var/lib/nova/instances/bdc89be5-8a10-4ee8-85d6-a7243fa7d970_del complete#033[00m
Nov 22 03:24:06 np0005531888 podman[241660]: 2025-11-22 08:24:06.696147474 +0000 UTC m=+0.141752668 container remove 83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.702 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[36664964-5904-49a5-927b-efc402823d07]: (4, ('Sat Nov 22 08:24:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745 (83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b)\n83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b\nSat Nov 22 08:24:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745 (83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b)\n83a94b16c0ed915aab624100978b2a98122910cef2750b49ae1a93040229e84b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.704 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d86d2943-2b84-4d5f-bfdc-c2fc4e4247ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.705 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap026c2103-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.706 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 kernel: tap026c2103-10: left promiscuous mode
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.720 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.722 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[08f18279-72e8-4698-8ad8-1b37dc76b71f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.726 186792 INFO nova.compute.manager [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.728 186792 DEBUG oslo.service.loopingcall [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.728 186792 DEBUG nova.compute.manager [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.729 186792 DEBUG nova.network.neutron [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.739 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e552b9-3760-4f59-b300-40a7e0b2abab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.740 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf412fe-c395-4302-bb87-938021e1355e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.754 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e3d44a-2a39-46c3-87ae-b195c40e10e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655043, 'reachable_time': 25747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241675, 'error': None, 'target': 'ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.757 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-026c2103-167f-4a9a-bb3c-1ec829f03745 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:24:06 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:06.757 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[602bf1c7-a86e-485a-b250-69bd2e170142]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:06 np0005531888 systemd[1]: run-netns-ovnmeta\x2d026c2103\x2d167f\x2d4a9a\x2dbb3c\x2d1ec829f03745.mount: Deactivated successfully.
Nov 22 03:24:06 np0005531888 nova_compute[186788]: 2025-11-22 08:24:06.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:07.604 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:07 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:07.605 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.608 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.638 186792 DEBUG nova.network.neutron [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.668 186792 INFO nova.compute.manager [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Took 0.94 seconds to deallocate network for instance.#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.757 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.758 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.790 186792 DEBUG nova.compute.manager [req-480eb4c9-1a14-4b9a-a653-a77b2cee4a40 req-dce2a930-0cdd-4ea8-81b0-120525e62646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-vif-deleted-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.820 186792 DEBUG nova.compute.provider_tree [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.833 186792 DEBUG nova.scheduler.client.report [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.852 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.879 186792 INFO nova.scheduler.client.report [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance bdc89be5-8a10-4ee8-85d6-a7243fa7d970#033[00m
Nov 22 03:24:07 np0005531888 nova_compute[186788]: 2025-11-22 08:24:07.940 186792 DEBUG oslo_concurrency.lockutils [None req-fb098f5d-e60d-4742-817d-6c20d456572b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.176 186792 DEBUG nova.network.neutron [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updated VIF entry in instance network info cache for port 239d172c-f1e8-411e-86cb-a1f9d9a6809f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.176 186792 DEBUG nova.network.neutron [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Updating instance_info_cache with network_info: [{"id": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "address": "fa:16:3e:5a:b4:ce", "network": {"id": "026c2103-167f-4a9a-bb3c-1ec829f03745", "bridge": "br-int", "label": "tempest-network-smoke--158787951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap239d172c-f1", "ovs_interfaceid": "239d172c-f1e8-411e-86cb-a1f9d9a6809f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.197 186792 DEBUG oslo_concurrency.lockutils [req-384d051c-f73e-48d2-82fa-142e2763735a req-5f224827-8e55-481c-a78d-dc8e2b4ab131 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-bdc89be5-8a10-4ee8-85d6-a7243fa7d970" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.275 186792 DEBUG nova.compute.manager [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-vif-unplugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.275 186792 DEBUG oslo_concurrency.lockutils [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.275 186792 DEBUG oslo_concurrency.lockutils [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.276 186792 DEBUG oslo_concurrency.lockutils [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.276 186792 DEBUG nova.compute.manager [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] No waiting events found dispatching network-vif-unplugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.276 186792 WARNING nova.compute.manager [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received unexpected event network-vif-unplugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.276 186792 DEBUG nova.compute.manager [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.277 186792 DEBUG oslo_concurrency.lockutils [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.277 186792 DEBUG oslo_concurrency.lockutils [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.277 186792 DEBUG oslo_concurrency.lockutils [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "bdc89be5-8a10-4ee8-85d6-a7243fa7d970-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.277 186792 DEBUG nova.compute.manager [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] No waiting events found dispatching network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.278 186792 WARNING nova.compute.manager [req-aee8f1b6-c5e2-46e0-855d-1a1335451316 req-0b600600-b1ec-4e78-baa0-69b4f781df8a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Received unexpected event network-vif-plugged-239d172c-f1e8-411e-86cb-a1f9d9a6809f for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:24:08 np0005531888 nova_compute[186788]: 2025-11-22 08:24:08.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:09 np0005531888 nova_compute[186788]: 2025-11-22 08:24:09.070 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:10 np0005531888 nova_compute[186788]: 2025-11-22 08:24:10.592 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:10 np0005531888 nova_compute[186788]: 2025-11-22 08:24:10.698 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:11 np0005531888 nova_compute[186788]: 2025-11-22 08:24:11.551 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:11 np0005531888 nova_compute[186788]: 2025-11-22 08:24:11.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:11 np0005531888 nova_compute[186788]: 2025-11-22 08:24:11.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:24:11 np0005531888 nova_compute[186788]: 2025-11-22 08:24:11.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:24:11 np0005531888 nova_compute[186788]: 2025-11-22 08:24:11.977 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:24:13 np0005531888 nova_compute[186788]: 2025-11-22 08:24:13.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:14 np0005531888 nova_compute[186788]: 2025-11-22 08:24:14.071 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:14.607 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:14 np0005531888 nova_compute[186788]: 2025-11-22 08:24:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:16 np0005531888 nova_compute[186788]: 2025-11-22 08:24:16.554 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:16 np0005531888 podman[241677]: 2025-11-22 08:24:16.685607994 +0000 UTC m=+0.056962371 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:24:16 np0005531888 podman[241678]: 2025-11-22 08:24:16.690641089 +0000 UTC m=+0.058473200 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:24:19 np0005531888 nova_compute[186788]: 2025-11-22 08:24:19.074 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:19 np0005531888 nova_compute[186788]: 2025-11-22 08:24:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:19 np0005531888 nova_compute[186788]: 2025-11-22 08:24:19.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:19 np0005531888 nova_compute[186788]: 2025-11-22 08:24:19.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:19 np0005531888 nova_compute[186788]: 2025-11-22 08:24:19.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:19 np0005531888 nova_compute[186788]: 2025-11-22 08:24:19.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:24:20 np0005531888 podman[241721]: 2025-11-22 08:24:20.134452572 +0000 UTC m=+0.101051476 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 22 03:24:20 np0005531888 podman[241723]: 2025-11-22 08:24:20.139803414 +0000 UTC m=+0.095962781 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3)
Nov 22 03:24:20 np0005531888 podman[241724]: 2025-11-22 08:24:20.169938326 +0000 UTC m=+0.122815022 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.184 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.185 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.203 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.258 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.260 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.26642608642578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.260 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.260 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.629 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.878 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 60927646-221a-4141-a422-8a7823be8d64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.878 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:24:20 np0005531888 nova_compute[186788]: 2025-11-22 08:24:20.878 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.121 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.138 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.294 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.295 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.296 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.307 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.308 186792 INFO nova.compute.claims [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.512 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799846.5111701, bdc89be5-8a10-4ee8-85d6-a7243fa7d970 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.513 186792 INFO nova.compute.manager [-] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.557 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.736 186792 DEBUG nova.compute.manager [None req-0d5f801e-9b99-4954-a5ff-e4e394302126 - - - - - -] [instance: bdc89be5-8a10-4ee8-85d6-a7243fa7d970] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.786 186792 DEBUG nova.compute.provider_tree [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.797 186792 DEBUG nova.scheduler.client.report [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.883 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.883 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:21 np0005531888 nova_compute[186788]: 2025-11-22 08:24:21.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.022 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.022 186792 DEBUG nova.network.neutron [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.143 186792 INFO nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.183 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.246 186792 DEBUG nova.policy [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.405 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.406 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.407 186792 INFO nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Creating image(s)#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.407 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.407 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.408 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.421 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.482 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.483 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.484 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.499 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.559 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.560 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.655 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk 1073741824" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.656 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.656 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.714 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.716 186792 DEBUG nova.virt.disk.api [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.717 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.775 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.777 186792 DEBUG nova.virt.disk.api [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.777 186792 DEBUG nova.objects.instance [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 60927646-221a-4141-a422-8a7823be8d64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.794 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.794 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Ensure instance console log exists: /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.795 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.795 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:22 np0005531888 nova_compute[186788]: 2025-11-22 08:24:22.795 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:23 np0005531888 nova_compute[186788]: 2025-11-22 08:24:23.660 186792 DEBUG nova.network.neutron [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Successfully created port: 25f695e5-4550-44fd-aabf-8cabc50e1bb9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.076 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.654 186792 DEBUG nova.network.neutron [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Successfully updated port: 25f695e5-4550-44fd-aabf-8cabc50e1bb9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.711 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.712 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.712 186792 DEBUG nova.network.neutron [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.772 186792 DEBUG nova.compute.manager [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-changed-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.773 186792 DEBUG nova.compute.manager [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Refreshing instance network info cache due to event network-changed-25f695e5-4550-44fd-aabf-8cabc50e1bb9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.773 186792 DEBUG oslo_concurrency.lockutils [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.925 186792 DEBUG nova.network.neutron [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.972 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:24:24 np0005531888 nova_compute[186788]: 2025-11-22 08:24:24.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:25 np0005531888 nova_compute[186788]: 2025-11-22 08:24:25.983 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:25 np0005531888 nova_compute[186788]: 2025-11-22 08:24:25.984 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.561 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.919 186792 DEBUG nova.network.neutron [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updating instance_info_cache with network_info: [{"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.955 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.955 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Instance network_info: |[{"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.956 186792 DEBUG oslo_concurrency.lockutils [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.956 186792 DEBUG nova.network.neutron [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Refreshing network info cache for port 25f695e5-4550-44fd-aabf-8cabc50e1bb9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.959 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Start _get_guest_xml network_info=[{"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.964 186792 WARNING nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.968 186792 DEBUG nova.virt.libvirt.host [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.968 186792 DEBUG nova.virt.libvirt.host [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.976 186792 DEBUG nova.virt.libvirt.host [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.977 186792 DEBUG nova.virt.libvirt.host [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.979 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.979 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.979 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.980 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.980 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.980 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.980 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.980 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.981 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.981 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.981 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.981 186792 DEBUG nova.virt.hardware [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.984 186792 DEBUG nova.virt.libvirt.vif [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:24:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1050837626',display_name='tempest-TestGettingAddress-server-1050837626',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1050837626',id=159,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDo9IOjv1g7RgDVY2BCJ+mmy9Na2DHnkZ9mE8BRZMtj9iwqLsOttn1FTaKP3MHHav783x9ncOoaXWlLdYlMz7ANRW+4cS5nXI1iryizo5WsS7HuJMsWvhJA60lnbjcIyew==',key_name='tempest-TestGettingAddress-814265221',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ihrg38q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:24:22Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=60927646-221a-4141-a422-8a7823be8d64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.984 186792 DEBUG nova.network.os_vif_util [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.985 186792 DEBUG nova.network.os_vif_util [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.986 186792 DEBUG nova.objects.instance [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 60927646-221a-4141-a422-8a7823be8d64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:26 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.996 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <uuid>60927646-221a-4141-a422-8a7823be8d64</uuid>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <name>instance-0000009f</name>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestGettingAddress-server-1050837626</nova:name>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:24:26</nova:creationTime>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        <nova:port uuid="25f695e5-4550-44fd-aabf-8cabc50e1bb9">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe91:5d99" ipVersion="6"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <entry name="serial">60927646-221a-4141-a422-8a7823be8d64</entry>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <entry name="uuid">60927646-221a-4141-a422-8a7823be8d64</entry>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:24:26 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:24:26 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.config"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:91:5d:99"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <target dev="tap25f695e5-45"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/console.log" append="off"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:24:27 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:24:27 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:24:27 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:24:27 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.997 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Preparing to wait for external event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.997 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.997 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.997 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.998 186792 DEBUG nova.virt.libvirt.vif [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:24:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1050837626',display_name='tempest-TestGettingAddress-server-1050837626',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1050837626',id=159,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDo9IOjv1g7RgDVY2BCJ+mmy9Na2DHnkZ9mE8BRZMtj9iwqLsOttn1FTaKP3MHHav783x9ncOoaXWlLdYlMz7ANRW+4cS5nXI1iryizo5WsS7HuJMsWvhJA60lnbjcIyew==',key_name='tempest-TestGettingAddress-814265221',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ihrg38q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:24:22Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=60927646-221a-4141-a422-8a7823be8d64,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.998 186792 DEBUG nova.network.os_vif_util [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.999 186792 DEBUG nova.network.os_vif_util [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.999 186792 DEBUG os_vif [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:26.999 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.000 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.000 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.002 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.002 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25f695e5-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.003 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25f695e5-45, col_values=(('external_ids', {'iface-id': '25f695e5-4550-44fd-aabf-8cabc50e1bb9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:5d:99', 'vm-uuid': '60927646-221a-4141-a422-8a7823be8d64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.004 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:27 np0005531888 NetworkManager[55166]: <info>  [1763799867.0055] manager: (tap25f695e5-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.011 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.012 186792 INFO os_vif [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45')#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.511 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.512 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.512 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:91:5d:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.513 186792 INFO nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Using config drive#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.900 186792 INFO nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Creating config drive at /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.config#033[00m
Nov 22 03:24:27 np0005531888 nova_compute[186788]: 2025-11-22 08:24:27.906 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp77qsxj9a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.038 186792 DEBUG oslo_concurrency.processutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp77qsxj9a" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:28 np0005531888 kernel: tap25f695e5-45: entered promiscuous mode
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.0962] manager: (tap25f695e5-45): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Nov 22 03:24:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:28Z|00622|binding|INFO|Claiming lport 25f695e5-4550-44fd-aabf-8cabc50e1bb9 for this chassis.
Nov 22 03:24:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:28Z|00623|binding|INFO|25f695e5-4550-44fd-aabf-8cabc50e1bb9: Claiming fa:16:3e:91:5d:99 10.100.0.4 2001:db8::f816:3eff:fe91:5d99
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.097 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.100 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.108 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.1090] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.1095] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.119 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:5d:99 10.100.0.4 2001:db8::f816:3eff:fe91:5d99'], port_security=['fa:16:3e:91:5d:99 10.100.0.4 2001:db8::f816:3eff:fe91:5d99'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe91:5d99/64', 'neutron:device_id': '60927646-221a-4141-a422-8a7823be8d64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a58d74bd-bc51-4723-b0e3-c855953c364c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=25f695e5-4550-44fd-aabf-8cabc50e1bb9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.120 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 25f695e5-4550-44fd-aabf-8cabc50e1bb9 in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd bound to our chassis#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.122 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 326c0814-77d4-416b-a5a1-28be00b61ecd#033[00m
Nov 22 03:24:28 np0005531888 systemd-udevd[241821]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.134 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dca6e222-6dd1-4fc2-a796-5b2654a94cc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.135 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap326c0814-71 in ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.138 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap326c0814-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.138 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[30e54bb8-7a24-455d-9f4a-baab172b8852]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.139 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd1614e-e063-415c-8e20-e186f71b5ffd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 systemd-machined[153106]: New machine qemu-77-instance-0000009f.
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.1414] device (tap25f695e5-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.1426] device (tap25f695e5-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.149 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0d9dd8-59d0-444a-bbb9-f40105c3ce10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.174 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c182bd-eb28-4ece-9c51-277bf5440021]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 systemd[1]: Started Virtual Machine qemu-77-instance-0000009f.
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.194 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.205 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[669ce303-4067-464f-9f0e-ab0a8b25f312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.215 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[228181a8-2092-4a9c-87c0-53f806968834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.2163] manager: (tap326c0814-70): new Veth device (/org/freedesktop/NetworkManager/Devices/299)
Nov 22 03:24:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:28Z|00624|binding|INFO|Setting lport 25f695e5-4550-44fd-aabf-8cabc50e1bb9 ovn-installed in OVS
Nov 22 03:24:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:28Z|00625|binding|INFO|Setting lport 25f695e5-4550-44fd-aabf-8cabc50e1bb9 up in Southbound
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.217 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.245 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fdba3490-6830-414b-bdda-13f188b32fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.247 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[4acbb94e-c08d-4ad8-873b-2674d54e4d89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.2679] device (tap326c0814-70): carrier: link connected
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.273 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[553a48bd-a873-409e-a063-3464e3ea0f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.292 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[67c4531c-4ce4-4ad4-8dc4-0d84b6e521f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap326c0814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660891, 'reachable_time': 26442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241855, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.310 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[610932cf-b1b6-41fc-b123-cc163f08c69d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:f1bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 660891, 'tstamp': 660891}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241856, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.330 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf40ab0-5a2c-4f1e-a3e3-85e786583170]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap326c0814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660891, 'reachable_time': 26442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241857, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.361 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[66856e84-8898-4ab9-819b-ee78494312f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.415 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f574ea90-3f8e-44f7-85ef-e440db48e90c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.416 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap326c0814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.416 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.417 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap326c0814-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.418 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 kernel: tap326c0814-70: entered promiscuous mode
Nov 22 03:24:28 np0005531888 NetworkManager[55166]: <info>  [1763799868.4194] manager: (tap326c0814-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.420 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.422 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap326c0814-70, col_values=(('external_ids', {'iface-id': 'e1bc69f6-ec55-4040-be0d-44f334cbe3a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.423 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:28Z|00626|binding|INFO|Releasing lport e1bc69f6-ec55-4040-be0d-44f334cbe3a6 from this chassis (sb_readonly=0)
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.424 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.424 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/326c0814-77d4-416b-a5a1-28be00b61ecd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/326c0814-77d4-416b-a5a1-28be00b61ecd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.425 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[afbd2582-2a69-4ce2-998d-034091c8a7d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.426 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-326c0814-77d4-416b-a5a1-28be00b61ecd
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/326c0814-77d4-416b-a5a1-28be00b61ecd.pid.haproxy
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 326c0814-77d4-416b-a5a1-28be00b61ecd
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:24:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:28.427 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'env', 'PROCESS_TAG=haproxy-326c0814-77d4-416b-a5a1-28be00b61ecd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/326c0814-77d4-416b-a5a1-28be00b61ecd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.518 186792 DEBUG nova.compute.manager [req-84ee7445-dcd2-4817-9dcd-592b34967049 req-d91125ea-3777-4c5f-8cff-d69f44adcbd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.519 186792 DEBUG oslo_concurrency.lockutils [req-84ee7445-dcd2-4817-9dcd-592b34967049 req-d91125ea-3777-4c5f-8cff-d69f44adcbd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.520 186792 DEBUG oslo_concurrency.lockutils [req-84ee7445-dcd2-4817-9dcd-592b34967049 req-d91125ea-3777-4c5f-8cff-d69f44adcbd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.520 186792 DEBUG oslo_concurrency.lockutils [req-84ee7445-dcd2-4817-9dcd-592b34967049 req-d91125ea-3777-4c5f-8cff-d69f44adcbd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.520 186792 DEBUG nova.compute.manager [req-84ee7445-dcd2-4817-9dcd-592b34967049 req-d91125ea-3777-4c5f-8cff-d69f44adcbd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Processing event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.614 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799868.6137087, 60927646-221a-4141-a422-8a7823be8d64 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.614 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] VM Started (Lifecycle Event)#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.616 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.619 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.623 186792 INFO nova.virt.libvirt.driver [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] Instance spawned successfully.#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.624 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.662 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.668 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.669 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.670 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.671 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.671 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.672 186792 DEBUG nova.virt.libvirt.driver [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.680 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.712 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.712 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799868.6138482, 60927646-221a-4141-a422-8a7823be8d64 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.713 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.734 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.737 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763799868.61901, 60927646-221a-4141-a422-8a7823be8d64 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.737 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.753 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.756 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:24:28 np0005531888 nova_compute[186788]: 2025-11-22 08:24:28.777 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:24:28 np0005531888 podman[241893]: 2025-11-22 08:24:28.760906704 +0000 UTC m=+0.021746416 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.077 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.162 186792 INFO nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Took 6.76 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.163 186792 DEBUG nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:29 np0005531888 podman[241893]: 2025-11-22 08:24:29.245998524 +0000 UTC m=+0.506838216 container create 73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.323 186792 DEBUG nova.network.neutron [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updated VIF entry in instance network info cache for port 25f695e5-4550-44fd-aabf-8cabc50e1bb9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.323 186792 DEBUG nova.network.neutron [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updating instance_info_cache with network_info: [{"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.338 186792 DEBUG oslo_concurrency.lockutils [req-e9e3593c-64ee-4cb2-a05a-7b94e97873f3 req-fe945a18-9af3-4789-afca-d5a0503a2332 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:29 np0005531888 systemd[1]: Started libpod-conmon-73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536.scope.
Nov 22 03:24:29 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.402 186792 INFO nova.compute.manager [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Took 9.11 seconds to build instance.#033[00m
Nov 22 03:24:29 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a4081705da3f3198ec163d4c90aa31f198df3948ae392c98d87c79819d8ef58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.462 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:29 np0005531888 podman[241893]: 2025-11-22 08:24:29.491163893 +0000 UTC m=+0.752003605 container init 73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:24:29 np0005531888 podman[241893]: 2025-11-22 08:24:29.496640998 +0000 UTC m=+0.757480690 container start 73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:24:29 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [NOTICE]   (241912) : New worker (241914) forked
Nov 22 03:24:29 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [NOTICE]   (241912) : Loading success.
Nov 22 03:24:29 np0005531888 nova_compute[186788]: 2025-11-22 08:24:29.622 186792 DEBUG oslo_concurrency.lockutils [None req-c13cb38a-2852-4703-b800-89bf94246374 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:30 np0005531888 nova_compute[186788]: 2025-11-22 08:24:30.617 186792 DEBUG nova.compute.manager [req-6c4312da-59ff-410c-8a85-803a72726d5f req-abf12440-5108-4aa1-84e1-f4ea92ca03ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:30 np0005531888 nova_compute[186788]: 2025-11-22 08:24:30.618 186792 DEBUG oslo_concurrency.lockutils [req-6c4312da-59ff-410c-8a85-803a72726d5f req-abf12440-5108-4aa1-84e1-f4ea92ca03ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:30 np0005531888 nova_compute[186788]: 2025-11-22 08:24:30.618 186792 DEBUG oslo_concurrency.lockutils [req-6c4312da-59ff-410c-8a85-803a72726d5f req-abf12440-5108-4aa1-84e1-f4ea92ca03ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:30 np0005531888 nova_compute[186788]: 2025-11-22 08:24:30.619 186792 DEBUG oslo_concurrency.lockutils [req-6c4312da-59ff-410c-8a85-803a72726d5f req-abf12440-5108-4aa1-84e1-f4ea92ca03ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:30 np0005531888 nova_compute[186788]: 2025-11-22 08:24:30.619 186792 DEBUG nova.compute.manager [req-6c4312da-59ff-410c-8a85-803a72726d5f req-abf12440-5108-4aa1-84e1-f4ea92ca03ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] No waiting events found dispatching network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:24:30 np0005531888 nova_compute[186788]: 2025-11-22 08:24:30.619 186792 WARNING nova.compute.manager [req-6c4312da-59ff-410c-8a85-803a72726d5f req-abf12440-5108-4aa1-84e1-f4ea92ca03ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received unexpected event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:24:32 np0005531888 nova_compute[186788]: 2025-11-22 08:24:32.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:34 np0005531888 nova_compute[186788]: 2025-11-22 08:24:34.081 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:35 np0005531888 podman[241923]: 2025-11-22 08:24:35.679682428 +0000 UTC m=+0.049699792 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:24:35 np0005531888 podman[241924]: 2025-11-22 08:24:35.69029283 +0000 UTC m=+0.059926885 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:24:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:36.841 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:36.842 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:36.842 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.853 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '60927646-221a-4141-a422-8a7823be8d64', 'name': 'tempest-TestGettingAddress-server-1050837626', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.897 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.read.latency volume: 2256447747 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.898 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.read.latency volume: 4092800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c0016e5-d84b-4637-ac3f-388e97a5c642', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2256447747, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.855402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeadd164-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '8c9ff2f617ca7b16e46275a957291bbfd4f64728d724d62787720d1044754731'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4092800, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.855402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeade758-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '57554e0bbf91a8e935352d54eb3d36481c823a66cedd50b7de3713f1f2202755'}]}, 'timestamp': '2025-11-22 08:24:36.899229', '_unique_id': '9711859dfce948dd94435dc4e9ba9ce8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.900 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.902 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.902 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>]
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.903 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.903 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e13c7f14-afab-4dee-aa43-ee41c22cda99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.903332', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeae9b6c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': 'b1cac41c521f47a83a2eb1b63de1f190d30da838ab490c5065aba8eb345211e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.903332', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeaeacf6-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '0287ba05de9b6b418aeb6ce48834a94578fde981a6a0e248f3e166dd3164a7c3'}]}, 'timestamp': '2025-11-22 08:24:36.904273', '_unique_id': '31be7c3cb8fe4076bac93989b77c09fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.905 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.907 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.907 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b70ab9b7-0401-462f-b6b7-3a5f92d226ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.906955', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeaf29b0-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': 'a2c685760884fc1aa0e62527de7d1dc09f89b30746cbe66d01653ada0e13df89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.906955', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeaf3ee6-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': 'a38d13cb14df94097327bf6cebb8693bc21a5ee012ab2951e8639567cda14611'}]}, 'timestamp': '2025-11-22 08:24:36.908022', '_unique_id': 'ef8d2b98330c4634b1f4ba0dde81d56c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.914 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 60927646-221a-4141-a422-8a7823be8d64 / tap25f695e5-45 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.914 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44c1ff10-feb4-490b-affe-3e3fe789bd00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.910710', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb05dee-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': '2785758073cce9ead57ad9e91e9758ec5408cfa5b883c09bb9cf1c766be6e141'}]}, 'timestamp': '2025-11-22 08:24:36.915449', '_unique_id': '8fc7783d65c84025b456e00a5a83ba02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.918 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40642074-6268-49e6-8441-a511ac656142', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.918430', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb0ee30-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': '8a59af4ae0b70b6f155c829bbf1c7c69132e5ac84ded20a8b0cab7c3fe83f4ec'}]}, 'timestamp': '2025-11-22 08:24:36.919086', '_unique_id': 'fc3ecbc555774b71ad0f601372e5cf65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.921 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.922 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7351bb02-165d-480e-8d48-584bca89fc15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.921813', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeb16c70-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '0c5bd05537b239ee45daacda45e85a93b1c1c8948ca24b6a475aa2f1148d4d34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.921813', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeb1828c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '4046ffeb39a9922d6c9fcad3a87cd56cf90c50d651f84995e0a36fee7958f764'}]}, 'timestamp': '2025-11-22 08:24:36.922850', '_unique_id': '722c629a841a405594db0c031790a8e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.925 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8d27e55-6026-49b1-ae7a-6e4143911f11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.925748', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb2061c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': 'c031aeff1e20ecdbe2dc6999484bde82fc2cf9e3f5b4d92cf7449cf944e2561b'}]}, 'timestamp': '2025-11-22 08:24:36.926252', '_unique_id': 'be4349c522124a11a5cc8552d2276bf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.928 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.944 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/cpu volume: 8080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2fb0702-9b1e-4c6e-aeb7-26a64387a354', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8080000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64', 'timestamp': '2025-11-22T08:24:36.928626', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'aeb4f282-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.644108511, 'message_signature': 'bf693a0a8a35daa8a332afa874ebf6c164ad1efff9b1e4325d763a50c23f9a05'}]}, 'timestamp': '2025-11-22 08:24:36.945495', '_unique_id': 'c00a65b9fd08493d83c4d594f6112b7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.947 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.948 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fdd2594-f226-4fbf-b118-a83a9593c0c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.948791', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb58e72-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': 'd6530ef691436025dfef2e6a6ac94c39779d0b11d8269d4ffdaff790559055ad'}]}, 'timestamp': '2025-11-22 08:24:36.949421', '_unique_id': '487b1854fb254cec9e7e4d3528aedc8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.952 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.952 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc864eb1-7f8a-405d-bc83-8fd83c2ec83f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.952250', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeb61536-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': 'fbe1129322c9d51dce51cbd7366cfcee38651e9bf3a2bfc9165451ea3471f23e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.952250', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeb62846-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': 'd8d4e70dc07953cfa558b73f1c5dda616009b5045901b3512bfaa8a04e12381a'}]}, 'timestamp': '2025-11-22 08:24:36.953302', '_unique_id': 'd93744dd47354097968ca11a236bb050'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.954 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.956 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81ce254c-60bc-4027-8cf0-c45366ca1345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.956455', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb6ba7c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': '7841a9506e66e6b10e3593c57bad0fe153290662735e452994f40b0c32b4b36c'}]}, 'timestamp': '2025-11-22 08:24:36.957174', '_unique_id': '00d56dbf5151406caba392bf30b3ab06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.970 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.970 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09a404ff-2eca-4082-a93b-838875c25b16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.960276', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeb8d528-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.659836217, 'message_signature': '56a86a20e59048c2e9bd8ac61cf5054d7ad1b493ce8e4373f8a1dbdfb5aeb834'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.960276', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeb8e216-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.659836217, 'message_signature': '7d66353bb602b062556c0ac4f0584c608c5c431aec70c88b7f897b2bf073ae43'}]}, 'timestamp': '2025-11-22 08:24:36.971112', '_unique_id': 'dbda13df324c4046869fcca2d13aa8e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.973 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.973 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>]
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.973 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.973 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>]
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.974 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3b8b9f9-ea24-4747-b974-edb8f82d2740', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.974262', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb969f2-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': '5606e8e86e6a5bdf9348b1c3113d1899d5c30d417fdd1e073923f6167284ed3c'}]}, 'timestamp': '2025-11-22 08:24:36.974682', '_unique_id': 'd43644442ed240798fca49798a6b0aa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.976 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0562c4d-42c7-44e6-8b8a-10a1da26817c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.976416', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeb9bee8-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': '0fe20d10953b2fc64f3fb1ebf46b57318f7772adb853ebc65c20a3798fcd983f'}]}, 'timestamp': '2025-11-22 08:24:36.976774', '_unique_id': '6ba31c9b5a1f4488b63d51f1aaccb695'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.977 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.978 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5925714-a26e-48fa-b929-3c02a28a230c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.978825', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aeba1de8-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': 'db9fe7bc84920872440b313c1279f8ec2bad47e47bfd16b9419b0a13e03db2b1'}]}, 'timestamp': '2025-11-22 08:24:36.979230', '_unique_id': '19a6ced60d4143e2a52cd2a607c0c1fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.980 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.981 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1050837626>]
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.981 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.981 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f13038a-89c0-4705-9040-63758fb14b0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.981377', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aeba7f90-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.659836217, 'message_signature': '4a55d4d28e87055526ee72446e8e7f3380991dfcc9074baa82775eb1cd3d2e48'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.981377', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aeba8c10-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.659836217, 'message_signature': 'a59a27d733d0a69faa4a1df92b5dd83002849a91a5ec2f7cf551cf4175f408a7'}]}, 'timestamp': '2025-11-22 08:24:36.981996', '_unique_id': '38cde5d760c74104bcd26b59d17eacdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.983 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '223b9575-012c-441d-bd12-8af00e226078', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.983473', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aebad1fc-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': 'e53a3e2f24bb617a726f01c24385de512cb28e4589d81c472d696b408362e305'}]}, 'timestamp': '2025-11-22 08:24:36.983799', '_unique_id': '7e43aef089d94a99b95c6bce207dc392'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.984 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.987 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.987 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.987 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33475c9f-5433-4e7d-8dc5-b9592bac845b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.987637', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aebb73d2-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.659836217, 'message_signature': '792cd71f5c80d038379f672ae849ba479c91212081319f3a9e4ced1530e4d1b5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.987637', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aebb7e7c-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.659836217, 'message_signature': '80fa741873d59250b5dea4b4b3a2425b265f05a8d395fedd185c07e7e3c7ec72'}]}, 'timestamp': '2025-11-22 08:24:36.988195', '_unique_id': '53f9c15f16704c418152424dae8de3c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.991 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.991 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.992 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 60927646-221a-4141-a422-8a7823be8d64: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.992 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.993 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1768bb3-4150-4d37-9f2c-ae64270ea226', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60927646-221a-4141-a422-8a7823be8d64-vda', 'timestamp': '2025-11-22T08:24:36.992831', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aebc4230-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '708010b8300dfee6564746cd354ebc909423bf0636b078231e56a8cfecf02aa1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '60927646-221a-4141-a422-8a7823be8d64-sda', 'timestamp': '2025-11-22T08:24:36.992831', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'instance-0000009f', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aebc5004-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.554964928, 'message_signature': '376055760f267b3a2b08dbaf468e4c7855832e8498087b7cfc12f67246a5cdf4'}]}, 'timestamp': '2025-11-22 08:24:36.993591', '_unique_id': '3192c94afffa4fecaf43b8f288d336c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.994 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.998 12 DEBUG ceilometer.compute.pollsters [-] 60927646-221a-4141-a422-8a7823be8d64/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4096bd11-cb4f-40ea-9939-b15dfadb365e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009f-60927646-221a-4141-a422-8a7823be8d64-tap25f695e5-45', 'timestamp': '2025-11-22T08:24:36.998384', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1050837626', 'name': 'tap25f695e5-45', 'instance_id': '60927646-221a-4141-a422-8a7823be8d64', 'instance_type': 'm1.nano', 'host': '684744bcd2e3f0dbf3a94ba80d013bdf9d66d9f0d60727f9892aa8d2', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:5d:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap25f695e5-45'}, 'message_id': 'aebd17d2-c77c-11f0-941d-fa163e6775e5', 'monotonic_time': 6617.610251577, 'message_signature': '9be24ec0b9ac0389a6d6342e732fa9a949769ccbb3804e90a7c2f6201344f15d'}]}, 'timestamp': '2025-11-22 08:24:36.998754', '_unique_id': '0168c3f466234701971a804e90cc9c68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:24:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:24:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:24:37 np0005531888 nova_compute[186788]: 2025-11-22 08:24:37.009 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:38 np0005531888 nova_compute[186788]: 2025-11-22 08:24:38.323 186792 DEBUG nova.compute.manager [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-changed-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:38 np0005531888 nova_compute[186788]: 2025-11-22 08:24:38.324 186792 DEBUG nova.compute.manager [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Refreshing instance network info cache due to event network-changed-25f695e5-4550-44fd-aabf-8cabc50e1bb9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:24:38 np0005531888 nova_compute[186788]: 2025-11-22 08:24:38.324 186792 DEBUG oslo_concurrency.lockutils [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:38 np0005531888 nova_compute[186788]: 2025-11-22 08:24:38.324 186792 DEBUG oslo_concurrency.lockutils [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:38 np0005531888 nova_compute[186788]: 2025-11-22 08:24:38.324 186792 DEBUG nova.network.neutron [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Refreshing network info cache for port 25f695e5-4550-44fd-aabf-8cabc50e1bb9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:24:39 np0005531888 nova_compute[186788]: 2025-11-22 08:24:39.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:39 np0005531888 nova_compute[186788]: 2025-11-22 08:24:39.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:41 np0005531888 nova_compute[186788]: 2025-11-22 08:24:41.686 186792 DEBUG nova.network.neutron [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updated VIF entry in instance network info cache for port 25f695e5-4550-44fd-aabf-8cabc50e1bb9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:24:41 np0005531888 nova_compute[186788]: 2025-11-22 08:24:41.686 186792 DEBUG nova.network.neutron [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updating instance_info_cache with network_info: [{"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:41 np0005531888 nova_compute[186788]: 2025-11-22 08:24:41.758 186792 DEBUG oslo_concurrency.lockutils [req-cc71e538-666b-4876-b2cd-9cbdc7e0fdd0 req-ec5e53a9-e203-4928-ac64-5507b602c377 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:42 np0005531888 nova_compute[186788]: 2025-11-22 08:24:42.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:44Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:91:5d:99 10.100.0.4
Nov 22 03:24:44 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:44Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:91:5d:99 10.100.0.4
Nov 22 03:24:44 np0005531888 nova_compute[186788]: 2025-11-22 08:24:44.084 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:47 np0005531888 nova_compute[186788]: 2025-11-22 08:24:47.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:47 np0005531888 podman[241984]: 2025-11-22 08:24:47.68714855 +0000 UTC m=+0.054233345 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:24:47 np0005531888 podman[241983]: 2025-11-22 08:24:47.692289567 +0000 UTC m=+0.061543844 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:24:49 np0005531888 nova_compute[186788]: 2025-11-22 08:24:49.085 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531888 podman[242024]: 2025-11-22 08:24:50.724774545 +0000 UTC m=+0.086719762 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:24:50 np0005531888 podman[242025]: 2025-11-22 08:24:50.735575571 +0000 UTC m=+0.089594434 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 03:24:50 np0005531888 podman[242026]: 2025-11-22 08:24:50.765856446 +0000 UTC m=+0.116915506 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 03:24:52 np0005531888 nova_compute[186788]: 2025-11-22 08:24:52.018 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:54 np0005531888 nova_compute[186788]: 2025-11-22 08:24:54.088 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:55 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.908 186792 DEBUG nova.compute.manager [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-changed-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:55 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.909 186792 DEBUG nova.compute.manager [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Refreshing instance network info cache due to event network-changed-25f695e5-4550-44fd-aabf-8cabc50e1bb9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:24:55 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.909 186792 DEBUG oslo_concurrency.lockutils [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:55 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.909 186792 DEBUG oslo_concurrency.lockutils [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:55 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.909 186792 DEBUG nova.network.neutron [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Refreshing network info cache for port 25f695e5-4550-44fd-aabf-8cabc50e1bb9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:24:55 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.999 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:55.999 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.000 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.000 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.000 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.009 186792 INFO nova.compute.manager [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Terminating instance#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.016 186792 DEBUG nova.compute.manager [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:24:56 np0005531888 kernel: tap25f695e5-45 (unregistering): left promiscuous mode
Nov 22 03:24:56 np0005531888 NetworkManager[55166]: <info>  [1763799896.0741] device (tap25f695e5-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:24:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:56Z|00627|binding|INFO|Releasing lport 25f695e5-4550-44fd-aabf-8cabc50e1bb9 from this chassis (sb_readonly=0)
Nov 22 03:24:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:56Z|00628|binding|INFO|Setting lport 25f695e5-4550-44fd-aabf-8cabc50e1bb9 down in Southbound
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.087 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:24:56Z|00629|binding|INFO|Removing iface tap25f695e5-45 ovn-installed in OVS
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.090 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:56.107 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:5d:99 10.100.0.4 2001:db8::f816:3eff:fe91:5d99'], port_security=['fa:16:3e:91:5d:99 10.100.0.4 2001:db8::f816:3eff:fe91:5d99'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe91:5d99/64', 'neutron:device_id': '60927646-221a-4141-a422-8a7823be8d64', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a58d74bd-bc51-4723-b0e3-c855953c364c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=25f695e5-4550-44fd-aabf-8cabc50e1bb9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:56.109 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 25f695e5-4550-44fd-aabf-8cabc50e1bb9 in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd unbound from our chassis#033[00m
Nov 22 03:24:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:56.110 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 326c0814-77d4-416b-a5a1-28be00b61ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:24:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:56.111 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[04ba7166-6f02-42a1-85c5-5ba96c149cd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:56.112 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd namespace which is not needed anymore#033[00m
Nov 22 03:24:56 np0005531888 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Nov 22 03:24:56 np0005531888 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000009f.scope: Consumed 15.579s CPU time.
Nov 22 03:24:56 np0005531888 systemd-machined[153106]: Machine qemu-77-instance-0000009f terminated.
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.235 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.240 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.280 186792 INFO nova.virt.libvirt.driver [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] Instance destroyed successfully.#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.281 186792 DEBUG nova.objects.instance [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 60927646-221a-4141-a422-8a7823be8d64 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.294 186792 DEBUG nova.virt.libvirt.vif [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:24:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1050837626',display_name='tempest-TestGettingAddress-server-1050837626',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1050837626',id=159,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDo9IOjv1g7RgDVY2BCJ+mmy9Na2DHnkZ9mE8BRZMtj9iwqLsOttn1FTaKP3MHHav783x9ncOoaXWlLdYlMz7ANRW+4cS5nXI1iryizo5WsS7HuJMsWvhJA60lnbjcIyew==',key_name='tempest-TestGettingAddress-814265221',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:24:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ihrg38q9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:24:29Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=60927646-221a-4141-a422-8a7823be8d64,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.295 186792 DEBUG nova.network.os_vif_util [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.296 186792 DEBUG nova.network.os_vif_util [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.297 186792 DEBUG os_vif [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.298 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.298 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f695e5-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.300 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.305 186792 INFO os_vif [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:5d:99,bridge_name='br-int',has_traffic_filtering=True,id=25f695e5-4550-44fd-aabf-8cabc50e1bb9,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25f695e5-45')#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.305 186792 INFO nova.virt.libvirt.driver [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Deleting instance files /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64_del#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.306 186792 INFO nova.virt.libvirt.driver [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Deletion of /var/lib/nova/instances/60927646-221a-4141-a422-8a7823be8d64_del complete#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.387 186792 DEBUG nova.compute.manager [req-d9ce8cfe-dfa9-44fa-b232-de9771211b68 req-974c2ba7-abfc-4afd-ac0f-e76690e7cf89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-vif-unplugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.388 186792 DEBUG oslo_concurrency.lockutils [req-d9ce8cfe-dfa9-44fa-b232-de9771211b68 req-974c2ba7-abfc-4afd-ac0f-e76690e7cf89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.388 186792 DEBUG oslo_concurrency.lockutils [req-d9ce8cfe-dfa9-44fa-b232-de9771211b68 req-974c2ba7-abfc-4afd-ac0f-e76690e7cf89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.388 186792 DEBUG oslo_concurrency.lockutils [req-d9ce8cfe-dfa9-44fa-b232-de9771211b68 req-974c2ba7-abfc-4afd-ac0f-e76690e7cf89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.388 186792 DEBUG nova.compute.manager [req-d9ce8cfe-dfa9-44fa-b232-de9771211b68 req-974c2ba7-abfc-4afd-ac0f-e76690e7cf89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] No waiting events found dispatching network-vif-unplugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.388 186792 DEBUG nova.compute.manager [req-d9ce8cfe-dfa9-44fa-b232-de9771211b68 req-974c2ba7-abfc-4afd-ac0f-e76690e7cf89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-vif-unplugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:24:56 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [NOTICE]   (241912) : haproxy version is 2.8.14-c23fe91
Nov 22 03:24:56 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [NOTICE]   (241912) : path to executable is /usr/sbin/haproxy
Nov 22 03:24:56 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [WARNING]  (241912) : Exiting Master process...
Nov 22 03:24:56 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [WARNING]  (241912) : Exiting Master process...
Nov 22 03:24:56 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [ALERT]    (241912) : Current worker (241914) exited with code 143 (Terminated)
Nov 22 03:24:56 np0005531888 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[241908]: [WARNING]  (241912) : All workers exited. Exiting... (0)
Nov 22 03:24:56 np0005531888 systemd[1]: libpod-73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536.scope: Deactivated successfully.
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.393 186792 INFO nova.compute.manager [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.394 186792 DEBUG oslo.service.loopingcall [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.394 186792 DEBUG nova.compute.manager [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.394 186792 DEBUG nova.network.neutron [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:24:56 np0005531888 podman[242115]: 2025-11-22 08:24:56.400724724 +0000 UTC m=+0.209637596 container died 73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 03:24:56 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536-userdata-shm.mount: Deactivated successfully.
Nov 22 03:24:56 np0005531888 systemd[1]: var-lib-containers-storage-overlay-2a4081705da3f3198ec163d4c90aa31f198df3948ae392c98d87c79819d8ef58-merged.mount: Deactivated successfully.
Nov 22 03:24:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:56.855 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:56 np0005531888 nova_compute[186788]: 2025-11-22 08:24:56.855 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:57 np0005531888 podman[242115]: 2025-11-22 08:24:57.066298413 +0000 UTC m=+0.875211255 container cleanup 73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:24:57 np0005531888 systemd[1]: libpod-conmon-73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536.scope: Deactivated successfully.
Nov 22 03:24:57 np0005531888 podman[242163]: 2025-11-22 08:24:57.536296832 +0000 UTC m=+0.447442495 container remove 73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.542 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[054d6a55-520a-4e5a-8d94-f22454943d0c]: (4, ('Sat Nov 22 08:24:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd (73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536)\n73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536\nSat Nov 22 08:24:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd (73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536)\n73f033c658a6d4795d161f3c108015efa6bcd5c40c47e72d8a787e49656a1536\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.544 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7db007c1-b620-42e1-950a-88c9ccc8d556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.545 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap326c0814-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:57 np0005531888 nova_compute[186788]: 2025-11-22 08:24:57.547 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:57 np0005531888 kernel: tap326c0814-70: left promiscuous mode
Nov 22 03:24:57 np0005531888 nova_compute[186788]: 2025-11-22 08:24:57.560 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.563 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[91468f9c-0b24-437a-a9e7-d8a509ac51f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.578 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8f762b5e-c820-43cf-a780-b45784670caa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.581 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cddc7e-f843-47bd-9dfb-79122e2fdb69]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.597 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ebc822-8b1c-46b1-81ac-4ee0b2804de2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660884, 'reachable_time': 17735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242180, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 systemd[1]: run-netns-ovnmeta\x2d326c0814\x2d77d4\x2d416b\x2da5a1\x2d28be00b61ecd.mount: Deactivated successfully.
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.603 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.603 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6309ab-78dc-4962-8acc-9e8e6a5fd7c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:24:57.605 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.037 186792 DEBUG nova.network.neutron [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.072 186792 INFO nova.compute.manager [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] Took 1.68 seconds to deallocate network for instance.#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.097 186792 DEBUG nova.compute.manager [req-3885ae95-425d-4c65-808b-5af7b6a2d5bb req-cd0fd52c-8c5d-4012-9167-706bb8f73cb6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-vif-deleted-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.157 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.157 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.218 186792 DEBUG nova.compute.provider_tree [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.236 186792 DEBUG nova.scheduler.client.report [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.261 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.346 186792 INFO nova.scheduler.client.report [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 60927646-221a-4141-a422-8a7823be8d64#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.453 186792 DEBUG oslo_concurrency.lockutils [None req-f710db05-af5e-47f2-b0c1-d3ffbf26a8ca 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.495 186792 DEBUG nova.compute.manager [req-94e4c15a-fdd9-42d3-8999-125d583d21dc req-1ae500f5-15df-44c3-830e-c26f23565b34 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.495 186792 DEBUG oslo_concurrency.lockutils [req-94e4c15a-fdd9-42d3-8999-125d583d21dc req-1ae500f5-15df-44c3-830e-c26f23565b34 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60927646-221a-4141-a422-8a7823be8d64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.496 186792 DEBUG oslo_concurrency.lockutils [req-94e4c15a-fdd9-42d3-8999-125d583d21dc req-1ae500f5-15df-44c3-830e-c26f23565b34 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.496 186792 DEBUG oslo_concurrency.lockutils [req-94e4c15a-fdd9-42d3-8999-125d583d21dc req-1ae500f5-15df-44c3-830e-c26f23565b34 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60927646-221a-4141-a422-8a7823be8d64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.496 186792 DEBUG nova.compute.manager [req-94e4c15a-fdd9-42d3-8999-125d583d21dc req-1ae500f5-15df-44c3-830e-c26f23565b34 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] No waiting events found dispatching network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.497 186792 WARNING nova.compute.manager [req-94e4c15a-fdd9-42d3-8999-125d583d21dc req-1ae500f5-15df-44c3-830e-c26f23565b34 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Received unexpected event network-vif-plugged-25f695e5-4550-44fd-aabf-8cabc50e1bb9 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.610 186792 DEBUG nova.network.neutron [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updated VIF entry in instance network info cache for port 25f695e5-4550-44fd-aabf-8cabc50e1bb9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.611 186792 DEBUG nova.network.neutron [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60927646-221a-4141-a422-8a7823be8d64] Updating instance_info_cache with network_info: [{"id": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "address": "fa:16:3e:91:5d:99", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe91:5d99", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25f695e5-45", "ovs_interfaceid": "25f695e5-4550-44fd-aabf-8cabc50e1bb9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:58 np0005531888 nova_compute[186788]: 2025-11-22 08:24:58.637 186792 DEBUG oslo_concurrency.lockutils [req-2f96143d-8047-4cc2-a7bf-f7ebde42a298 req-6863bfd0-2228-4e06-a4b0-4212ba2ed29c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-60927646-221a-4141-a422-8a7823be8d64" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:59 np0005531888 nova_compute[186788]: 2025-11-22 08:24:59.090 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:01 np0005531888 nova_compute[186788]: 2025-11-22 08:25:01.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:01.607 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:04 np0005531888 nova_compute[186788]: 2025-11-22 08:25:04.091 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:06 np0005531888 nova_compute[186788]: 2025-11-22 08:25:06.304 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:06 np0005531888 podman[242182]: 2025-11-22 08:25:06.683048288 +0000 UTC m=+0.053146498 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:25:06 np0005531888 podman[242181]: 2025-11-22 08:25:06.715481446 +0000 UTC m=+0.086978420 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:25:06 np0005531888 nova_compute[186788]: 2025-11-22 08:25:06.965 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:08 np0005531888 nova_compute[186788]: 2025-11-22 08:25:08.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:09 np0005531888 nova_compute[186788]: 2025-11-22 08:25:09.094 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:10 np0005531888 nova_compute[186788]: 2025-11-22 08:25:10.463 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:10 np0005531888 nova_compute[186788]: 2025-11-22 08:25:10.586 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:11 np0005531888 nova_compute[186788]: 2025-11-22 08:25:11.280 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799896.2775264, 60927646-221a-4141-a422-8a7823be8d64 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:25:11 np0005531888 nova_compute[186788]: 2025-11-22 08:25:11.280 186792 INFO nova.compute.manager [-] [instance: 60927646-221a-4141-a422-8a7823be8d64] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:25:11 np0005531888 nova_compute[186788]: 2025-11-22 08:25:11.303 186792 DEBUG nova.compute.manager [None req-52202352-aa64-4b3f-bef5-ca70d0d074a3 - - - - - -] [instance: 60927646-221a-4141-a422-8a7823be8d64] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:25:11 np0005531888 nova_compute[186788]: 2025-11-22 08:25:11.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:12 np0005531888 nova_compute[186788]: 2025-11-22 08:25:12.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:12 np0005531888 nova_compute[186788]: 2025-11-22 08:25:12.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:25:12 np0005531888 nova_compute[186788]: 2025-11-22 08:25:12.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:25:13 np0005531888 nova_compute[186788]: 2025-11-22 08:25:13.183 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:25:14 np0005531888 nova_compute[186788]: 2025-11-22 08:25:14.094 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:14 np0005531888 nova_compute[186788]: 2025-11-22 08:25:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:15 np0005531888 nova_compute[186788]: 2025-11-22 08:25:15.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:16 np0005531888 nova_compute[186788]: 2025-11-22 08:25:16.311 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:18 np0005531888 podman[242225]: 2025-11-22 08:25:18.686857401 +0000 UTC m=+0.058090879 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:25:18 np0005531888 podman[242224]: 2025-11-22 08:25:18.709844657 +0000 UTC m=+0.079546627 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:25:19 np0005531888 nova_compute[186788]: 2025-11-22 08:25:19.096 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:19 np0005531888 nova_compute[186788]: 2025-11-22 08:25:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:19 np0005531888 nova_compute[186788]: 2025-11-22 08:25:19.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:19 np0005531888 nova_compute[186788]: 2025-11-22 08:25:19.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:19 np0005531888 nova_compute[186788]: 2025-11-22 08:25:19.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:19 np0005531888 nova_compute[186788]: 2025-11-22 08:25:19.989 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.178 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.179 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5698MB free_disk=73.26642990112305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.179 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.179 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.375 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.375 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.396 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.421 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.422 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.438 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.466 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.487 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.504 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.580 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:25:20 np0005531888 nova_compute[186788]: 2025-11-22 08:25:20.580 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:21 np0005531888 nova_compute[186788]: 2025-11-22 08:25:21.314 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:21 np0005531888 podman[242269]: 2025-11-22 08:25:21.69940958 +0000 UTC m=+0.063275617 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, maintainer=Red Hat, Inc.)
Nov 22 03:25:21 np0005531888 podman[242270]: 2025-11-22 08:25:21.706358341 +0000 UTC m=+0.070356571 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:25:21 np0005531888 podman[242271]: 2025-11-22 08:25:21.749919613 +0000 UTC m=+0.112045107 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:25:23 np0005531888 nova_compute[186788]: 2025-11-22 08:25:23.581 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:23 np0005531888 nova_compute[186788]: 2025-11-22 08:25:23.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:24 np0005531888 nova_compute[186788]: 2025-11-22 08:25:24.097 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:26 np0005531888 nova_compute[186788]: 2025-11-22 08:25:26.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:26 np0005531888 nova_compute[186788]: 2025-11-22 08:25:26.879 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:26 np0005531888 nova_compute[186788]: 2025-11-22 08:25:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:26 np0005531888 nova_compute[186788]: 2025-11-22 08:25:26.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:25:29 np0005531888 nova_compute[186788]: 2025-11-22 08:25:29.099 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:31 np0005531888 nova_compute[186788]: 2025-11-22 08:25:31.320 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:34 np0005531888 nova_compute[186788]: 2025-11-22 08:25:34.102 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:36 np0005531888 nova_compute[186788]: 2025-11-22 08:25:36.324 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:36.841 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:36.842 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:36.842 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:37 np0005531888 podman[242336]: 2025-11-22 08:25:37.695745262 +0000 UTC m=+0.058411708 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:25:37 np0005531888 podman[242337]: 2025-11-22 08:25:37.725196696 +0000 UTC m=+0.086590281 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:25:39 np0005531888 nova_compute[186788]: 2025-11-22 08:25:39.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:41 np0005531888 nova_compute[186788]: 2025-11-22 08:25:41.326 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:44 np0005531888 nova_compute[186788]: 2025-11-22 08:25:44.106 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:46 np0005531888 nova_compute[186788]: 2025-11-22 08:25:46.330 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531888 nova_compute[186788]: 2025-11-22 08:25:47.894 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:47.894 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:47.896 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:25:49 np0005531888 nova_compute[186788]: 2025-11-22 08:25:49.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:49 np0005531888 podman[242380]: 2025-11-22 08:25:49.686474389 +0000 UTC m=+0.051080537 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:25:49 np0005531888 podman[242379]: 2025-11-22 08:25:49.688729624 +0000 UTC m=+0.060668002 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:25:51 np0005531888 nova_compute[186788]: 2025-11-22 08:25:51.332 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:52 np0005531888 podman[242422]: 2025-11-22 08:25:52.68492168 +0000 UTC m=+0.059298629 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:25:52 np0005531888 podman[242423]: 2025-11-22 08:25:52.69141062 +0000 UTC m=+0.060919100 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:25:52 np0005531888 podman[242424]: 2025-11-22 08:25:52.719307726 +0000 UTC m=+0.085616456 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:25:54 np0005531888 nova_compute[186788]: 2025-11-22 08:25:54.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:25:55.899 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:56 np0005531888 nova_compute[186788]: 2025-11-22 08:25:56.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:59 np0005531888 nova_compute[186788]: 2025-11-22 08:25:59.112 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:01 np0005531888 nova_compute[186788]: 2025-11-22 08:26:01.337 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:04 np0005531888 nova_compute[186788]: 2025-11-22 08:26:04.114 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:06 np0005531888 nova_compute[186788]: 2025-11-22 08:26:06.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:08 np0005531888 podman[242487]: 2025-11-22 08:26:08.670907196 +0000 UTC m=+0.045092811 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:26:08 np0005531888 podman[242488]: 2025-11-22 08:26:08.70242132 +0000 UTC m=+0.072223436 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:26:08 np0005531888 nova_compute[186788]: 2025-11-22 08:26:08.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:09 np0005531888 nova_compute[186788]: 2025-11-22 08:26:09.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:10 np0005531888 nova_compute[186788]: 2025-11-22 08:26:10.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:11 np0005531888 nova_compute[186788]: 2025-11-22 08:26:11.342 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:12 np0005531888 nova_compute[186788]: 2025-11-22 08:26:12.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:12 np0005531888 nova_compute[186788]: 2025-11-22 08:26:12.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:26:12 np0005531888 nova_compute[186788]: 2025-11-22 08:26:12.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:26:12 np0005531888 nova_compute[186788]: 2025-11-22 08:26:12.968 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:26:14 np0005531888 nova_compute[186788]: 2025-11-22 08:26:14.119 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:15 np0005531888 nova_compute[186788]: 2025-11-22 08:26:15.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:16 np0005531888 nova_compute[186788]: 2025-11-22 08:26:16.345 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:16 np0005531888 nova_compute[186788]: 2025-11-22 08:26:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:19 np0005531888 nova_compute[186788]: 2025-11-22 08:26:19.120 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:19 np0005531888 nova_compute[186788]: 2025-11-22 08:26:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:19 np0005531888 nova_compute[186788]: 2025-11-22 08:26:19.985 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:26:19 np0005531888 nova_compute[186788]: 2025-11-22 08:26:19.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:26:19 np0005531888 nova_compute[186788]: 2025-11-22 08:26:19.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:26:19 np0005531888 nova_compute[186788]: 2025-11-22 08:26:19.986 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.209 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.210 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5728MB free_disk=73.26642990112305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.210 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.211 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.294 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.295 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.328 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.356 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.358 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:26:20 np0005531888 nova_compute[186788]: 2025-11-22 08:26:20.358 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:26:20 np0005531888 podman[242530]: 2025-11-22 08:26:20.68296857 +0000 UTC m=+0.060795667 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:26:20 np0005531888 podman[242531]: 2025-11-22 08:26:20.684145159 +0000 UTC m=+0.056926491 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:26:21 np0005531888 nova_compute[186788]: 2025-11-22 08:26:21.348 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:23 np0005531888 podman[242573]: 2025-11-22 08:26:23.684621601 +0000 UTC m=+0.063673077 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41)
Nov 22 03:26:23 np0005531888 podman[242574]: 2025-11-22 08:26:23.687630865 +0000 UTC m=+0.058363447 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:26:23 np0005531888 podman[242580]: 2025-11-22 08:26:23.730496768 +0000 UTC m=+0.090952937 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:26:24 np0005531888 nova_compute[186788]: 2025-11-22 08:26:24.121 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:25 np0005531888 nova_compute[186788]: 2025-11-22 08:26:25.358 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:25 np0005531888 nova_compute[186788]: 2025-11-22 08:26:25.358 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:26 np0005531888 nova_compute[186788]: 2025-11-22 08:26:26.351 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:26 np0005531888 nova_compute[186788]: 2025-11-22 08:26:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:26 np0005531888 nova_compute[186788]: 2025-11-22 08:26:26.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:26:29 np0005531888 nova_compute[186788]: 2025-11-22 08:26:29.122 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:31 np0005531888 nova_compute[186788]: 2025-11-22 08:26:31.355 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:34 np0005531888 nova_compute[186788]: 2025-11-22 08:26:34.124 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:26:34Z|00630|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 22 03:26:36 np0005531888 nova_compute[186788]: 2025-11-22 08:26:36.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:26:36.843 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:26:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:26:36.843 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:26:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:26:36.843 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:26:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:39 np0005531888 nova_compute[186788]: 2025-11-22 08:26:39.125 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:39 np0005531888 podman[242638]: 2025-11-22 08:26:39.682276791 +0000 UTC m=+0.057365591 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:26:39 np0005531888 podman[242639]: 2025-11-22 08:26:39.699473885 +0000 UTC m=+0.068895776 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:26:41 np0005531888 nova_compute[186788]: 2025-11-22 08:26:41.359 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:41 np0005531888 nova_compute[186788]: 2025-11-22 08:26:41.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:44 np0005531888 nova_compute[186788]: 2025-11-22 08:26:44.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:46 np0005531888 nova_compute[186788]: 2025-11-22 08:26:46.361 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:49 np0005531888 nova_compute[186788]: 2025-11-22 08:26:49.129 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:50 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:26:50.715 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:26:50 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:26:50.716 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:26:50 np0005531888 nova_compute[186788]: 2025-11-22 08:26:50.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:51 np0005531888 nova_compute[186788]: 2025-11-22 08:26:51.363 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:51 np0005531888 podman[242683]: 2025-11-22 08:26:51.691454367 +0000 UTC m=+0.050273138 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:26:51 np0005531888 podman[242682]: 2025-11-22 08:26:51.691454617 +0000 UTC m=+0.054361578 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 03:26:54 np0005531888 nova_compute[186788]: 2025-11-22 08:26:54.141 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:54 np0005531888 podman[242722]: 2025-11-22 08:26:54.681678455 +0000 UTC m=+0.056784147 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:26:54 np0005531888 podman[242723]: 2025-11-22 08:26:54.681916891 +0000 UTC m=+0.054725537 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:26:54 np0005531888 podman[242724]: 2025-11-22 08:26:54.714554364 +0000 UTC m=+0.083387452 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:26:56 np0005531888 nova_compute[186788]: 2025-11-22 08:26:56.366 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:26:56.719 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:26:59 np0005531888 nova_compute[186788]: 2025-11-22 08:26:59.141 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:01 np0005531888 nova_compute[186788]: 2025-11-22 08:27:01.370 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:04 np0005531888 nova_compute[186788]: 2025-11-22 08:27:04.143 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:06 np0005531888 nova_compute[186788]: 2025-11-22 08:27:06.374 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:09 np0005531888 nova_compute[186788]: 2025-11-22 08:27:09.144 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:09 np0005531888 nova_compute[186788]: 2025-11-22 08:27:09.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:10 np0005531888 podman[242786]: 2025-11-22 08:27:10.691767361 +0000 UTC m=+0.060539920 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:27:10 np0005531888 podman[242787]: 2025-11-22 08:27:10.715899254 +0000 UTC m=+0.081018373 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Nov 22 03:27:10 np0005531888 nova_compute[186788]: 2025-11-22 08:27:10.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:11 np0005531888 nova_compute[186788]: 2025-11-22 08:27:11.377 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:14 np0005531888 nova_compute[186788]: 2025-11-22 08:27:14.146 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:14 np0005531888 nova_compute[186788]: 2025-11-22 08:27:14.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:14 np0005531888 nova_compute[186788]: 2025-11-22 08:27:14.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:27:14 np0005531888 nova_compute[186788]: 2025-11-22 08:27:14.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:27:14 np0005531888 nova_compute[186788]: 2025-11-22 08:27:14.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:27:16 np0005531888 nova_compute[186788]: 2025-11-22 08:27:16.379 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:16 np0005531888 nova_compute[186788]: 2025-11-22 08:27:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:16 np0005531888 nova_compute[186788]: 2025-11-22 08:27:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:19 np0005531888 nova_compute[186788]: 2025-11-22 08:27:19.149 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:21 np0005531888 nova_compute[186788]: 2025-11-22 08:27:21.382 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:21 np0005531888 nova_compute[186788]: 2025-11-22 08:27:21.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:21 np0005531888 nova_compute[186788]: 2025-11-22 08:27:21.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:27:21 np0005531888 nova_compute[186788]: 2025-11-22 08:27:21.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:27:21 np0005531888 nova_compute[186788]: 2025-11-22 08:27:21.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:27:21 np0005531888 nova_compute[186788]: 2025-11-22 08:27:21.977 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.140 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.142 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.26642990112305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.202 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.202 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.226 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.238 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.240 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:27:22 np0005531888 nova_compute[186788]: 2025-11-22 08:27:22.240 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:27:22 np0005531888 podman[242826]: 2025-11-22 08:27:22.680604192 +0000 UTC m=+0.055565406 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:27:22 np0005531888 podman[242827]: 2025-11-22 08:27:22.708386545 +0000 UTC m=+0.078505350 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:27:24 np0005531888 nova_compute[186788]: 2025-11-22 08:27:24.150 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:25 np0005531888 nova_compute[186788]: 2025-11-22 08:27:25.241 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:25 np0005531888 podman[242869]: 2025-11-22 08:27:25.680461988 +0000 UTC m=+0.057094945 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal)
Nov 22 03:27:25 np0005531888 podman[242871]: 2025-11-22 08:27:25.707290988 +0000 UTC m=+0.077999340 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:27:25 np0005531888 podman[242870]: 2025-11-22 08:27:25.715336506 +0000 UTC m=+0.087638347 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 03:27:25 np0005531888 nova_compute[186788]: 2025-11-22 08:27:25.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:26 np0005531888 nova_compute[186788]: 2025-11-22 08:27:26.384 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:27 np0005531888 nova_compute[186788]: 2025-11-22 08:27:27.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:27 np0005531888 nova_compute[186788]: 2025-11-22 08:27:27.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:27:29 np0005531888 nova_compute[186788]: 2025-11-22 08:27:29.152 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:31 np0005531888 nova_compute[186788]: 2025-11-22 08:27:31.387 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:34 np0005531888 nova_compute[186788]: 2025-11-22 08:27:34.154 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:36 np0005531888 nova_compute[186788]: 2025-11-22 08:27:36.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:27:36.844 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:27:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:27:36.845 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:27:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:27:36.845 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:27:39 np0005531888 nova_compute[186788]: 2025-11-22 08:27:39.156 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:41 np0005531888 nova_compute[186788]: 2025-11-22 08:27:41.393 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:41 np0005531888 podman[242933]: 2025-11-22 08:27:41.674446527 +0000 UTC m=+0.051147829 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:27:41 np0005531888 podman[242934]: 2025-11-22 08:27:41.683807417 +0000 UTC m=+0.053631890 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:27:44 np0005531888 nova_compute[186788]: 2025-11-22 08:27:44.159 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:46 np0005531888 nova_compute[186788]: 2025-11-22 08:27:46.396 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:49 np0005531888 nova_compute[186788]: 2025-11-22 08:27:49.160 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:27:49.620 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:27:49 np0005531888 nova_compute[186788]: 2025-11-22 08:27:49.620 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:27:49.621 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:27:51 np0005531888 nova_compute[186788]: 2025-11-22 08:27:51.398 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:53 np0005531888 podman[242980]: 2025-11-22 08:27:53.672794753 +0000 UTC m=+0.049928178 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 22 03:27:53 np0005531888 podman[242981]: 2025-11-22 08:27:53.678667178 +0000 UTC m=+0.049168000 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:27:55 np0005531888 nova_compute[186788]: 2025-11-22 08:27:55.326 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:27:55.624 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:27:56 np0005531888 nova_compute[186788]: 2025-11-22 08:27:56.400 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:56 np0005531888 podman[243026]: 2025-11-22 08:27:56.681773484 +0000 UTC m=+0.057267458 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6)
Nov 22 03:27:56 np0005531888 podman[243027]: 2025-11-22 08:27:56.686284796 +0000 UTC m=+0.058343916 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:27:56 np0005531888 podman[243028]: 2025-11-22 08:27:56.725192632 +0000 UTC m=+0.092440944 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:27:59 np0005531888 nova_compute[186788]: 2025-11-22 08:27:59.164 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:01 np0005531888 nova_compute[186788]: 2025-11-22 08:28:01.403 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:04 np0005531888 nova_compute[186788]: 2025-11-22 08:28:04.166 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:06 np0005531888 nova_compute[186788]: 2025-11-22 08:28:06.406 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:09 np0005531888 nova_compute[186788]: 2025-11-22 08:28:09.168 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:11 np0005531888 nova_compute[186788]: 2025-11-22 08:28:11.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:11 np0005531888 nova_compute[186788]: 2025-11-22 08:28:11.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:12 np0005531888 podman[243092]: 2025-11-22 08:28:12.689510732 +0000 UTC m=+0.061990074 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:28:12 np0005531888 podman[243093]: 2025-11-22 08:28:12.708295115 +0000 UTC m=+0.081711861 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 03:28:12 np0005531888 nova_compute[186788]: 2025-11-22 08:28:12.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:14 np0005531888 nova_compute[186788]: 2025-11-22 08:28:14.169 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:14 np0005531888 nova_compute[186788]: 2025-11-22 08:28:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:14 np0005531888 nova_compute[186788]: 2025-11-22 08:28:14.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:28:14 np0005531888 nova_compute[186788]: 2025-11-22 08:28:14.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:28:14 np0005531888 nova_compute[186788]: 2025-11-22 08:28:14.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:28:16 np0005531888 nova_compute[186788]: 2025-11-22 08:28:16.412 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:16 np0005531888 nova_compute[186788]: 2025-11-22 08:28:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:18 np0005531888 nova_compute[186788]: 2025-11-22 08:28:18.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:19 np0005531888 nova_compute[186788]: 2025-11-22 08:28:19.171 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:21 np0005531888 nova_compute[186788]: 2025-11-22 08:28:21.416 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:22 np0005531888 nova_compute[186788]: 2025-11-22 08:28:22.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:22 np0005531888 nova_compute[186788]: 2025-11-22 08:28:22.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:22 np0005531888 nova_compute[186788]: 2025-11-22 08:28:22.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:22 np0005531888 nova_compute[186788]: 2025-11-22 08:28:22.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:22 np0005531888 nova_compute[186788]: 2025-11-22 08:28:22.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.231 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.232 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5730MB free_disk=73.26580810546875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.233 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.233 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.330 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.330 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.369 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.382 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.384 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:28:23 np0005531888 nova_compute[186788]: 2025-11-22 08:28:23.384 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:24 np0005531888 nova_compute[186788]: 2025-11-22 08:28:24.173 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:24 np0005531888 podman[243134]: 2025-11-22 08:28:24.687749757 +0000 UTC m=+0.061522463 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:28:24 np0005531888 podman[243135]: 2025-11-22 08:28:24.695106118 +0000 UTC m=+0.062116728 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:28:25 np0005531888 nova_compute[186788]: 2025-11-22 08:28:25.384 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:26 np0005531888 nova_compute[186788]: 2025-11-22 08:28:26.419 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:26 np0005531888 nova_compute[186788]: 2025-11-22 08:28:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:27 np0005531888 podman[243179]: 2025-11-22 08:28:27.677453333 +0000 UTC m=+0.054845169 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9)
Nov 22 03:28:27 np0005531888 podman[243180]: 2025-11-22 08:28:27.689751136 +0000 UTC m=+0.060774986 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 03:28:27 np0005531888 podman[243181]: 2025-11-22 08:28:27.727539265 +0000 UTC m=+0.093823738 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:28:28 np0005531888 nova_compute[186788]: 2025-11-22 08:28:28.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:28 np0005531888 nova_compute[186788]: 2025-11-22 08:28:28.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:28:29 np0005531888 nova_compute[186788]: 2025-11-22 08:28:29.175 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.762 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.762 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.813 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.929 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.931 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.940 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:28:30 np0005531888 nova_compute[186788]: 2025-11-22 08:28:30.941 186792 INFO nova.compute.claims [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.199 186792 DEBUG nova.compute.provider_tree [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.214 186792 DEBUG nova.scheduler.client.report [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.422 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.443 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.444 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.720 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.721 186792 DEBUG nova.network.neutron [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.771 186792 INFO nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.817 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.982 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.984 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.985 186792 INFO nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Creating image(s)#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.985 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.986 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:31 np0005531888 nova_compute[186788]: 2025-11-22 08:28:31.986 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.001 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.063 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.065 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.065 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.081 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.142 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.143 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.513 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk 1073741824" returned: 0 in 0.370s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.514 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.514 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.575 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.576 186792 DEBUG nova.virt.disk.api [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.576 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.637 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.638 186792 DEBUG nova.virt.disk.api [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.639 186792 DEBUG nova.objects.instance [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.653 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.654 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Ensure instance console log exists: /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.654 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.655 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.655 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:32 np0005531888 nova_compute[186788]: 2025-11-22 08:28:32.971 186792 DEBUG nova.policy [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:28:34 np0005531888 nova_compute[186788]: 2025-11-22 08:28:34.177 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:34 np0005531888 nova_compute[186788]: 2025-11-22 08:28:34.781 186792 DEBUG nova.network.neutron [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Successfully created port: e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:28:36 np0005531888 nova_compute[186788]: 2025-11-22 08:28:36.424 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:36 np0005531888 nova_compute[186788]: 2025-11-22 08:28:36.709 186792 DEBUG nova.network.neutron [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Successfully updated port: e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:28:36 np0005531888 nova_compute[186788]: 2025-11-22 08:28:36.755 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:28:36 np0005531888 nova_compute[186788]: 2025-11-22 08:28:36.756 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:28:36 np0005531888 nova_compute[186788]: 2025-11-22 08:28:36.756 186792 DEBUG nova.network.neutron [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:28:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:36.845 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:36.845 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:36.846 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.849 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.850 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:28:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:37 np0005531888 nova_compute[186788]: 2025-11-22 08:28:37.056 186792 DEBUG nova.compute.manager [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:28:37 np0005531888 nova_compute[186788]: 2025-11-22 08:28:37.057 186792 DEBUG nova.compute.manager [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing instance network info cache due to event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:28:37 np0005531888 nova_compute[186788]: 2025-11-22 08:28:37.057 186792 DEBUG oslo_concurrency.lockutils [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:28:37 np0005531888 nova_compute[186788]: 2025-11-22 08:28:37.098 186792 DEBUG nova.network.neutron [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.034 186792 DEBUG nova.network.neutron [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.280 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.280 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance network_info: |[{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.281 186792 DEBUG oslo_concurrency.lockutils [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.281 186792 DEBUG nova.network.neutron [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.284 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Start _get_guest_xml network_info=[{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.290 186792 WARNING nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.296 186792 DEBUG nova.virt.libvirt.host [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.297 186792 DEBUG nova.virt.libvirt.host [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.304 186792 DEBUG nova.virt.libvirt.host [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.305 186792 DEBUG nova.virt.libvirt.host [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.306 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.306 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.306 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.306 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.307 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.307 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.307 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.307 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.308 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.308 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.308 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.308 186792 DEBUG nova.virt.hardware [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.311 186792 DEBUG nova.virt.libvirt.vif [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:28:31Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.312 186792 DEBUG nova.network.os_vif_util [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.312 186792 DEBUG nova.network.os_vif_util [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.313 186792 DEBUG nova.objects.instance [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.344 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <uuid>1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3</uuid>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <name>instance-000000a2</name>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-670921403</nova:name>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:28:38</nova:creationTime>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        <nova:port uuid="e6dd9383-6fd6-4da4-8c3b-126dd22ec505">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <entry name="serial">1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3</entry>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <entry name="uuid">1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3</entry>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:09:92:29"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <target dev="tape6dd9383-6f"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/console.log" append="off"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:28:38 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:28:38 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:28:38 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:28:38 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.346 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Preparing to wait for external event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.347 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.347 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.347 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.348 186792 DEBUG nova.virt.libvirt.vif [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:28:31Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.348 186792 DEBUG nova.network.os_vif_util [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.349 186792 DEBUG nova.network.os_vif_util [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.349 186792 DEBUG os_vif [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.350 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.351 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.351 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.356 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.357 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6dd9383-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.357 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6dd9383-6f, col_values=(('external_ids', {'iface-id': 'e6dd9383-6fd6-4da4-8c3b-126dd22ec505', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:92:29', 'vm-uuid': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.359 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:38 np0005531888 NetworkManager[55166]: <info>  [1763800118.3599] manager: (tape6dd9383-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.361 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.371 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.373 186792 INFO os_vif [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f')#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.522 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.523 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.524 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:09:92:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.525 186792 INFO nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Using config drive#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.889 186792 INFO nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Creating config drive at /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config#033[00m
Nov 22 03:28:38 np0005531888 nova_compute[186788]: 2025-11-22 08:28:38.894 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6qumfy12 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.020 186792 DEBUG oslo_concurrency.processutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6qumfy12" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:28:39 np0005531888 kernel: tape6dd9383-6f: entered promiscuous mode
Nov 22 03:28:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:39Z|00631|binding|INFO|Claiming lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for this chassis.
Nov 22 03:28:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:39Z|00632|binding|INFO|e6dd9383-6fd6-4da4-8c3b-126dd22ec505: Claiming fa:16:3e:09:92:29 10.100.0.7
Nov 22 03:28:39 np0005531888 NetworkManager[55166]: <info>  [1763800119.0805] manager: (tape6dd9383-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.080 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.089 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 systemd-udevd[243277]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:28:39 np0005531888 systemd-machined[153106]: New machine qemu-78-instance-000000a2.
Nov 22 03:28:39 np0005531888 NetworkManager[55166]: <info>  [1763800119.1158] device (tape6dd9383-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:28:39 np0005531888 NetworkManager[55166]: <info>  [1763800119.1166] device (tape6dd9383-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:28:39 np0005531888 systemd[1]: Started Virtual Machine qemu-78-instance-000000a2.
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.139 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:39Z|00633|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 ovn-installed in OVS
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.143 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:39Z|00634|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 up in Southbound
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.156 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:92:29 10.100.0.7'], port_security=['fa:16:3e:09:92:29 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24e159b0-64cc-460c-86db-d34db1ef5f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bbea3c6-adfe-4d7f-816a-93045d66e49e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e6dd9383-6fd6-4da4-8c3b-126dd22ec505) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.158 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 in datapath 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef bound to our chassis#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.159 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.168 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8555e5-9c5c-4a80-a4c6-997ad6c6525b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.169 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9ec8d93f-61 in ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.171 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9ec8d93f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.171 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[207e365a-21c3-4f7a-b722-372ac4a7ab4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.172 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[453102a3-3c63-464b-b4df-b39e45ec7020]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.178 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.185 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4bf960-5abe-46a5-a507-cea99d499810]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.197 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5179275a-19b1-40ef-9813-f70d4d773973]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.225 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8560854a-c74e-4e14-943e-aa3cbc7153c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 NetworkManager[55166]: <info>  [1763800119.2350] manager: (tap9ec8d93f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.234 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aeadc3d3-689f-4ca5-89f4-777cc2116220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.267 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9b20fe27-d668-4d86-81a9-ea46b756e3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.270 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[209380c7-64ee-4359-9b62-53573b82c074]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 NetworkManager[55166]: <info>  [1763800119.2938] device (tap9ec8d93f-60): carrier: link connected
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.300 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9859ff-6f0b-4d86-a5af-a4b4c3581859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.316 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[132d207f-afec-4ad7-b90a-72455509af93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ec8d93f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:a4:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685993, 'reachable_time': 16916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243313, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.329 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9887d113-0bc3-4003-bedb-de3d9830ac02]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:a47c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 685993, 'tstamp': 685993}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243314, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.342 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[92dd8393-c060-4ce6-9ddf-5758f6db6b3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ec8d93f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:a4:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685993, 'reachable_time': 16916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243315, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.371 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[09f7223d-efc1-4f38-aa3d-1533b8a9e1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.417 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5c040b93-d206-4d5e-ad4e-acf84f4d61f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.419 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ec8d93f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.419 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.420 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ec8d93f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:28:39 np0005531888 NetworkManager[55166]: <info>  [1763800119.4220] manager: (tap9ec8d93f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.421 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 kernel: tap9ec8d93f-60: entered promiscuous mode
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.424 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.425 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9ec8d93f-60, col_values=(('external_ids', {'iface-id': '2f82bda6-e5f9-45ef-96c2-3856b66571d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:28:39 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:39Z|00635|binding|INFO|Releasing lport 2f82bda6-e5f9-45ef-96c2-3856b66571d8 from this chassis (sb_readonly=0)
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.426 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.440 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.441 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.442 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c80a1679-dce9-49dd-bb42-ebbe07a70b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.443 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.pid.haproxy
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:28:39 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:28:39.443 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'env', 'PROCESS_TAG=haproxy-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.704 186792 DEBUG nova.compute.manager [req-6c5b4527-e21f-4098-96e2-365e42c171fc req-122a324b-aa98-45aa-b680-ce7cb07537b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.705 186792 DEBUG oslo_concurrency.lockutils [req-6c5b4527-e21f-4098-96e2-365e42c171fc req-122a324b-aa98-45aa-b680-ce7cb07537b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.705 186792 DEBUG oslo_concurrency.lockutils [req-6c5b4527-e21f-4098-96e2-365e42c171fc req-122a324b-aa98-45aa-b680-ce7cb07537b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.706 186792 DEBUG oslo_concurrency.lockutils [req-6c5b4527-e21f-4098-96e2-365e42c171fc req-122a324b-aa98-45aa-b680-ce7cb07537b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.706 186792 DEBUG nova.compute.manager [req-6c5b4527-e21f-4098-96e2-365e42c171fc req-122a324b-aa98-45aa-b680-ce7cb07537b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Processing event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.807 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800119.8070347, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.808 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Started (Lifecycle Event)#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.810 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.813 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.816 186792 INFO nova.virt.libvirt.driver [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance spawned successfully.#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.816 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.829 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.833 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.839 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.840 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.840 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.840 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.841 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.841 186792 DEBUG nova.virt.libvirt.driver [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.861 186792 DEBUG nova.network.neutron [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updated VIF entry in instance network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.862 186792 DEBUG nova.network.neutron [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.864 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.864 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800119.8071606, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.864 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.883 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.885 186792 DEBUG oslo_concurrency.lockutils [req-ab8e3eb1-5d2c-4959-bfbf-7109a9204a25 req-e6f9d5f3-0945-42eb-8058-6a957f68182d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.889 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800119.812817, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.889 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:28:39 np0005531888 podman[243353]: 2025-11-22 08:28:39.798545078 +0000 UTC m=+0.028966793 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.914 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:28:39 np0005531888 nova_compute[186788]: 2025-11-22 08:28:39.918 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:28:40 np0005531888 nova_compute[186788]: 2025-11-22 08:28:40.076 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:28:40 np0005531888 nova_compute[186788]: 2025-11-22 08:28:40.176 186792 INFO nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Took 8.19 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:28:40 np0005531888 nova_compute[186788]: 2025-11-22 08:28:40.177 186792 DEBUG nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:28:40 np0005531888 podman[243353]: 2025-11-22 08:28:40.239224365 +0000 UTC m=+0.469646070 container create 4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:28:40 np0005531888 nova_compute[186788]: 2025-11-22 08:28:40.336 186792 INFO nova.compute.manager [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Took 9.44 seconds to build instance.#033[00m
Nov 22 03:28:40 np0005531888 nova_compute[186788]: 2025-11-22 08:28:40.452 186792 DEBUG oslo_concurrency.lockutils [None req-4f282684-21f9-47d7-8f23-2fbfc492729c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:40 np0005531888 systemd[1]: Started libpod-conmon-4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f.scope.
Nov 22 03:28:40 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:28:40 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b8f54bf667823ba3ccb53b42778493a076174b0f486dfb13febcf7c1776ff/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:28:40 np0005531888 podman[243353]: 2025-11-22 08:28:40.654881296 +0000 UTC m=+0.885303021 container init 4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:28:40 np0005531888 podman[243353]: 2025-11-22 08:28:40.660883993 +0000 UTC m=+0.891305688 container start 4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:28:40 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [NOTICE]   (243373) : New worker (243375) forked
Nov 22 03:28:40 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [NOTICE]   (243373) : Loading success.
Nov 22 03:28:41 np0005531888 nova_compute[186788]: 2025-11-22 08:28:41.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:42 np0005531888 nova_compute[186788]: 2025-11-22 08:28:42.023 186792 DEBUG nova.compute.manager [req-403a2df8-fa1b-4c05-ae28-75da93eb9802 req-72062c91-39c4-47a2-8cd4-e846c4924e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:28:42 np0005531888 nova_compute[186788]: 2025-11-22 08:28:42.023 186792 DEBUG oslo_concurrency.lockutils [req-403a2df8-fa1b-4c05-ae28-75da93eb9802 req-72062c91-39c4-47a2-8cd4-e846c4924e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:42 np0005531888 nova_compute[186788]: 2025-11-22 08:28:42.024 186792 DEBUG oslo_concurrency.lockutils [req-403a2df8-fa1b-4c05-ae28-75da93eb9802 req-72062c91-39c4-47a2-8cd4-e846c4924e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:42 np0005531888 nova_compute[186788]: 2025-11-22 08:28:42.024 186792 DEBUG oslo_concurrency.lockutils [req-403a2df8-fa1b-4c05-ae28-75da93eb9802 req-72062c91-39c4-47a2-8cd4-e846c4924e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:42 np0005531888 nova_compute[186788]: 2025-11-22 08:28:42.024 186792 DEBUG nova.compute.manager [req-403a2df8-fa1b-4c05-ae28-75da93eb9802 req-72062c91-39c4-47a2-8cd4-e846c4924e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:28:42 np0005531888 nova_compute[186788]: 2025-11-22 08:28:42.025 186792 WARNING nova.compute.manager [req-403a2df8-fa1b-4c05-ae28-75da93eb9802 req-72062c91-39c4-47a2-8cd4-e846c4924e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:28:43 np0005531888 nova_compute[186788]: 2025-11-22 08:28:43.360 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:43 np0005531888 podman[243385]: 2025-11-22 08:28:43.68345498 +0000 UTC m=+0.050832922 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:28:43 np0005531888 podman[243384]: 2025-11-22 08:28:43.709947741 +0000 UTC m=+0.078161213 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:28:44 np0005531888 nova_compute[186788]: 2025-11-22 08:28:44.179 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:46Z|00636|binding|INFO|Releasing lport 2f82bda6-e5f9-45ef-96c2-3856b66571d8 from this chassis (sb_readonly=0)
Nov 22 03:28:46 np0005531888 nova_compute[186788]: 2025-11-22 08:28:46.292 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:46 np0005531888 NetworkManager[55166]: <info>  [1763800126.3111] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Nov 22 03:28:46 np0005531888 NetworkManager[55166]: <info>  [1763800126.3124] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Nov 22 03:28:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:28:46Z|00637|binding|INFO|Releasing lport 2f82bda6-e5f9-45ef-96c2-3856b66571d8 from this chassis (sb_readonly=0)
Nov 22 03:28:46 np0005531888 nova_compute[186788]: 2025-11-22 08:28:46.324 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:46 np0005531888 nova_compute[186788]: 2025-11-22 08:28:46.329 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:48 np0005531888 nova_compute[186788]: 2025-11-22 08:28:48.362 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:49 np0005531888 nova_compute[186788]: 2025-11-22 08:28:49.183 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:49 np0005531888 nova_compute[186788]: 2025-11-22 08:28:49.766 186792 DEBUG nova.compute.manager [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:28:49 np0005531888 nova_compute[186788]: 2025-11-22 08:28:49.766 186792 DEBUG nova.compute.manager [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing instance network info cache due to event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:28:49 np0005531888 nova_compute[186788]: 2025-11-22 08:28:49.767 186792 DEBUG oslo_concurrency.lockutils [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:28:49 np0005531888 nova_compute[186788]: 2025-11-22 08:28:49.767 186792 DEBUG oslo_concurrency.lockutils [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:28:49 np0005531888 nova_compute[186788]: 2025-11-22 08:28:49.767 186792 DEBUG nova.network.neutron [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:28:52 np0005531888 nova_compute[186788]: 2025-11-22 08:28:52.332 186792 DEBUG nova.network.neutron [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updated VIF entry in instance network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:28:52 np0005531888 nova_compute[186788]: 2025-11-22 08:28:52.333 186792 DEBUG nova.network.neutron [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:28:52 np0005531888 nova_compute[186788]: 2025-11-22 08:28:52.414 186792 DEBUG oslo_concurrency.lockutils [req-0ea82331-aa0c-4da3-896b-cdd49bf19f7a req-5f747699-afe2-42e8-a3a2-e07f84261aa9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:28:53 np0005531888 nova_compute[186788]: 2025-11-22 08:28:53.364 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:54 np0005531888 nova_compute[186788]: 2025-11-22 08:28:54.188 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:55 np0005531888 podman[243439]: 2025-11-22 08:28:55.692493951 +0000 UTC m=+0.065816170 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:28:55 np0005531888 podman[243438]: 2025-11-22 08:28:55.722654922 +0000 UTC m=+0.098587845 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:28:58 np0005531888 nova_compute[186788]: 2025-11-22 08:28:58.367 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:58 np0005531888 podman[243491]: 2025-11-22 08:28:58.691607768 +0000 UTC m=+0.061731979 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Nov 22 03:28:58 np0005531888 podman[243492]: 2025-11-22 08:28:58.720908398 +0000 UTC m=+0.081131966 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:28:58 np0005531888 podman[243493]: 2025-11-22 08:28:58.740315986 +0000 UTC m=+0.100588625 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:28:59 np0005531888 nova_compute[186788]: 2025-11-22 08:28:59.191 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:00Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:92:29 10.100.0.7
Nov 22 03:29:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:00Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:92:29 10.100.0.7
Nov 22 03:29:03 np0005531888 nova_compute[186788]: 2025-11-22 08:29:03.369 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:04 np0005531888 nova_compute[186788]: 2025-11-22 08:29:04.192 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:06 np0005531888 nova_compute[186788]: 2025-11-22 08:29:06.658 186792 INFO nova.compute.manager [None req-a96a9007-a9df-4478-80c7-b34202c27d54 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Get console output#033[00m
Nov 22 03:29:06 np0005531888 nova_compute[186788]: 2025-11-22 08:29:06.663 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:29:08 np0005531888 nova_compute[186788]: 2025-11-22 08:29:08.372 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:09 np0005531888 nova_compute[186788]: 2025-11-22 08:29:09.194 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:11 np0005531888 nova_compute[186788]: 2025-11-22 08:29:11.084 186792 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:29:11 np0005531888 nova_compute[186788]: 2025-11-22 08:29:11.085 186792 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:29:11 np0005531888 nova_compute[186788]: 2025-11-22 08:29:11.085 186792 DEBUG nova.network.neutron [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.015 186792 DEBUG nova.network.neutron [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.025 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.040 186792 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.144 186792 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.145 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Creating file /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/57a95b935c194d5eb076cede3e2e13d6.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.145 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/57a95b935c194d5eb076cede3e2e13d6.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.375 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.608 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/57a95b935c194d5eb076cede3e2e13d6.tmp" returned: 1 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.609 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/57a95b935c194d5eb076cede3e2e13d6.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.609 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Creating directory /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.610 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.814 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:13 np0005531888 nova_compute[186788]: 2025-11-22 08:29:13.819 186792 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.199 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:14 np0005531888 podman[243556]: 2025-11-22 08:29:14.672258839 +0000 UTC m=+0.046963596 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:29:14 np0005531888 podman[243557]: 2025-11-22 08:29:14.676310188 +0000 UTC m=+0.046863803 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.969 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.970 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.970 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:29:14 np0005531888 nova_compute[186788]: 2025-11-22 08:29:14.970 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:29:16 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:16Z|00638|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 22 03:29:17 np0005531888 kernel: tape6dd9383-6f (unregistering): left promiscuous mode
Nov 22 03:29:17 np0005531888 NetworkManager[55166]: <info>  [1763800157.3462] device (tape6dd9383-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00639|binding|INFO|Releasing lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 from this chassis (sb_readonly=0)
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00640|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 down in Southbound
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00641|binding|INFO|Removing iface tape6dd9383-6f ovn-installed in OVS
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.355 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.373 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:92:29 10.100.0.7'], port_security=['fa:16:3e:09:92:29 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24e159b0-64cc-460c-86db-d34db1ef5f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bbea3c6-adfe-4d7f-816a-93045d66e49e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e6dd9383-6fd6-4da4-8c3b-126dd22ec505) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.374 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 in datapath 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef unbound from our chassis#033[00m
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.376 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.378 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.377 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[becf57f5-25e4-49e5-b544-42c7c86a5bf2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.378 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef namespace which is not needed anymore#033[00m
Nov 22 03:29:17 np0005531888 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Nov 22 03:29:17 np0005531888 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a2.scope: Consumed 16.411s CPU time.
Nov 22 03:29:17 np0005531888 systemd-machined[153106]: Machine qemu-78-instance-000000a2 terminated.
Nov 22 03:29:17 np0005531888 kernel: tape6dd9383-6f: entered promiscuous mode
Nov 22 03:29:17 np0005531888 NetworkManager[55166]: <info>  [1763800157.5781] manager: (tape6dd9383-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/307)
Nov 22 03:29:17 np0005531888 kernel: tape6dd9383-6f (unregistering): left promiscuous mode
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00642|binding|INFO|Claiming lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for this chassis.
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.582 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00643|binding|INFO|e6dd9383-6fd6-4da4-8c3b-126dd22ec505: Claiming fa:16:3e:09:92:29 10.100.0.7
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.594 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:92:29 10.100.0.7'], port_security=['fa:16:3e:09:92:29 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24e159b0-64cc-460c-86db-d34db1ef5f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bbea3c6-adfe-4d7f-816a-93045d66e49e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e6dd9383-6fd6-4da4-8c3b-126dd22ec505) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00644|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 ovn-installed in OVS
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00645|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 up in Southbound
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00646|binding|INFO|Releasing lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 from this chassis (sb_readonly=1)
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00647|if_status|INFO|Dropped 3 log messages in last 1127 seconds (most recently, 1127 seconds ago) due to excessive rate
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.601 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00648|if_status|INFO|Not setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 down as sb is readonly
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00649|binding|INFO|Removing iface tape6dd9383-6f ovn-installed in OVS
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.603 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00650|binding|INFO|Releasing lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 from this chassis (sb_readonly=0)
Nov 22 03:29:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:17Z|00651|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 down in Southbound
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.614 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:17.618 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:92:29 10.100.0.7'], port_security=['fa:16:3e:09:92:29 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24e159b0-64cc-460c-86db-d34db1ef5f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bbea3c6-adfe-4d7f-816a-93045d66e49e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e6dd9383-6fd6-4da4-8c3b-126dd22ec505) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.750 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.785 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.785 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.785 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:17 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [NOTICE]   (243373) : haproxy version is 2.8.14-c23fe91
Nov 22 03:29:17 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [NOTICE]   (243373) : path to executable is /usr/sbin/haproxy
Nov 22 03:29:17 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [WARNING]  (243373) : Exiting Master process...
Nov 22 03:29:17 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [ALERT]    (243373) : Current worker (243375) exited with code 143 (Terminated)
Nov 22 03:29:17 np0005531888 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[243369]: [WARNING]  (243373) : All workers exited. Exiting... (0)
Nov 22 03:29:17 np0005531888 systemd[1]: libpod-4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f.scope: Deactivated successfully.
Nov 22 03:29:17 np0005531888 podman[243621]: 2025-11-22 08:29:17.839648595 +0000 UTC m=+0.368176344 container died 4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.840 186792 INFO nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance shutdown successfully after 4 seconds.#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.846 186792 INFO nova.virt.libvirt.driver [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance destroyed successfully.#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.848 186792 DEBUG nova.virt.libvirt.vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:28:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:29:10Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.849 186792 DEBUG nova.network.os_vif_util [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.850 186792 DEBUG nova.network.os_vif_util [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.850 186792 DEBUG os_vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.852 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.853 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6dd9383-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.855 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.862 186792 INFO os_vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f')#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.867 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.932 186792 DEBUG nova.compute.manager [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.933 186792 DEBUG oslo_concurrency.lockutils [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.933 186792 DEBUG oslo_concurrency.lockutils [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.933 186792 DEBUG oslo_concurrency.lockutils [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.934 186792 DEBUG nova.compute.manager [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.934 186792 WARNING nova.compute.manager [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.938 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.938 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:17 np0005531888 nova_compute[186788]: 2025-11-22 08:29:17.998 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:18 np0005531888 nova_compute[186788]: 2025-11-22 08:29:18.000 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Copying file /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk to 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:29:18 np0005531888 nova_compute[186788]: 2025-11-22 08:29:18.000 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:18 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:18.135 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:18 np0005531888 nova_compute[186788]: 2025-11-22 08:29:18.136 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:18 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f-userdata-shm.mount: Deactivated successfully.
Nov 22 03:29:18 np0005531888 systemd[1]: var-lib-containers-storage-overlay-3a6b8f54bf667823ba3ccb53b42778493a076174b0f486dfb13febcf7c1776ff-merged.mount: Deactivated successfully.
Nov 22 03:29:18 np0005531888 nova_compute[186788]: 2025-11-22 08:29:18.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.198 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:19 np0005531888 podman[243621]: 2025-11-22 08:29:19.359414994 +0000 UTC m=+1.887942743 container cleanup 4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:29:19 np0005531888 systemd[1]: libpod-conmon-4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f.scope: Deactivated successfully.
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.603 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "scp -r /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk" returned: 0 in 1.603s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.603 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Copying file /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.604 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk.config 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.851 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "scp -C -r /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk.config 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.852 186792 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Copying file /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.852 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk.info 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:29:19 np0005531888 nova_compute[186788]: 2025-11-22 08:29:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.053 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.054 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.054 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.055 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.055 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.055 186792 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.056 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.056 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.056 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.056 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.057 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.057 186792 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.057 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.057 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.057 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.058 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.058 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.058 186792 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.058 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.059 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.059 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.059 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.059 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.060 186792 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.060 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.060 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.060 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.061 186792 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.061 186792 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.061 186792 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.062 186792 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "scp -C -r /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_resize/disk.info 192.168.122.100:/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:29:20 np0005531888 podman[243676]: 2025-11-22 08:29:20.112822941 +0000 UTC m=+0.725060021 container remove 4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.119 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[26250198-4f6b-4b86-bd0c-2f5d625dce2c]: (4, ('Sat Nov 22 08:29:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef (4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f)\n4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f\nSat Nov 22 08:29:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef (4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f)\n4945f24af5b03582a92f062dc529b14a864b56620669acea51051df863a0fa3f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.122 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[422e9c58-6f91-493f-9cb1-d8e0013fad05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.123 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ec8d93f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.125 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:20 np0005531888 kernel: tap9ec8d93f-60: left promiscuous mode
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.138 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.140 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b8720e4e-3374-407e-baee-1ed633424e12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.158 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9027a800-b7a9-4059-82a2-c8e5b35c82a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.159 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ca191d-5127-4c13-8b76-868d4dc57f3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.175 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6d926099-6e90-45e2-b8dc-46711547ab05]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685986, 'reachable_time': 23254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243696, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.178 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.178 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3da742-1d23-442d-a627-da887595248c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 systemd[1]: run-netns-ovnmeta\x2d9ec8d93f\x2d618c\x2d42ae\x2d9ef7\x2d97cfef6c22ef.mount: Deactivated successfully.
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.179 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 in datapath 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef unbound from our chassis#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.181 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.182 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fd722b00-fab6-4bff-aff2-113a7fc09651]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.182 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 in datapath 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef unbound from our chassis#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.183 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.184 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfb7927-558e-4c1e-80af-dac02dfc5d80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:29:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:20.184 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:29:20 np0005531888 nova_compute[186788]: 2025-11-22 08:29:20.353 186792 DEBUG neutronclient.v2_0.client [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:29:21 np0005531888 nova_compute[186788]: 2025-11-22 08:29:21.272 186792 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:21 np0005531888 nova_compute[186788]: 2025-11-22 08:29:21.272 186792 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:21 np0005531888 nova_compute[186788]: 2025-11-22 08:29:21.272 186792 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:21 np0005531888 nova_compute[186788]: 2025-11-22 08:29:21.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:21 np0005531888 nova_compute[186788]: 2025-11-22 08:29:21.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:29:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:22.187 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:29:22 np0005531888 nova_compute[186788]: 2025-11-22 08:29:22.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:24 np0005531888 nova_compute[186788]: 2025-11-22 08:29:24.200 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:24 np0005531888 nova_compute[186788]: 2025-11-22 08:29:24.980 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.005 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.005 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.006 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.006 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.078 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-000000a2, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.207 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.208 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5700MB free_disk=73.23701477050781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.209 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.209 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.268 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration for instance 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.305 186792 INFO nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating resource usage from migration 379e335f-1bd3-4f90-85f8-6f71327f225f#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.305 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Starting to track outgoing migration 379e335f-1bd3-4f90-85f8-6f71327f225f with flavor 31612188-3cd6-428b-9166-9568f0affd4a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.550 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Migration 379e335f-1bd3-4f90-85f8-6f71327f225f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.551 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.551 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.833 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.874 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.941 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:29:25 np0005531888 nova_compute[186788]: 2025-11-22 08:29:25.942 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:26 np0005531888 podman[243699]: 2025-11-22 08:29:26.686961777 +0000 UTC m=+0.055352611 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:29:26 np0005531888 podman[243698]: 2025-11-22 08:29:26.694537674 +0000 UTC m=+0.063630206 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:29:26 np0005531888 nova_compute[186788]: 2025-11-22 08:29:26.916 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:26 np0005531888 nova_compute[186788]: 2025-11-22 08:29:26.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.857 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.970 186792 DEBUG nova.compute.manager [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.970 186792 DEBUG nova.compute.manager [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing instance network info cache due to event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.970 186792 DEBUG oslo_concurrency.lockutils [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.971 186792 DEBUG oslo_concurrency.lockutils [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:29:27 np0005531888 nova_compute[186788]: 2025-11-22 08:29:27.971 186792 DEBUG nova.network.neutron [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:29:29 np0005531888 nova_compute[186788]: 2025-11-22 08:29:29.201 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:29 np0005531888 podman[243742]: 2025-11-22 08:29:29.699002854 +0000 UTC m=+0.077377835 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:29:29 np0005531888 podman[243743]: 2025-11-22 08:29:29.70009755 +0000 UTC m=+0.065008559 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:29:29 np0005531888 podman[243744]: 2025-11-22 08:29:29.769594719 +0000 UTC m=+0.128802968 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 03:29:29 np0005531888 nova_compute[186788]: 2025-11-22 08:29:29.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:29 np0005531888 nova_compute[186788]: 2025-11-22 08:29:29.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:29:30 np0005531888 nova_compute[186788]: 2025-11-22 08:29:30.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:30 np0005531888 nova_compute[186788]: 2025-11-22 08:29:30.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:29:30 np0005531888 nova_compute[186788]: 2025-11-22 08:29:30.973 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:29:31 np0005531888 nova_compute[186788]: 2025-11-22 08:29:31.755 186792 DEBUG nova.network.neutron [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updated VIF entry in instance network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:29:31 np0005531888 nova_compute[186788]: 2025-11-22 08:29:31.756 186792 DEBUG nova.network.neutron [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:29:31 np0005531888 nova_compute[186788]: 2025-11-22 08:29:31.815 186792 DEBUG oslo_concurrency.lockutils [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:29:32 np0005531888 nova_compute[186788]: 2025-11-22 08:29:32.635 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800157.633999, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:29:32 np0005531888 nova_compute[186788]: 2025-11-22 08:29:32.636 186792 INFO nova.compute.manager [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:29:32 np0005531888 nova_compute[186788]: 2025-11-22 08:29:32.658 186792 DEBUG nova.compute.manager [None req-3baf9c24-326f-4f94-9dc3-4aad0a731a03 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:29:32 np0005531888 nova_compute[186788]: 2025-11-22 08:29:32.662 186792 DEBUG nova.compute.manager [None req-3baf9c24-326f-4f94-9dc3-4aad0a731a03 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:29:32 np0005531888 nova_compute[186788]: 2025-11-22 08:29:32.701 186792 INFO nova.compute.manager [None req-3baf9c24-326f-4f94-9dc3-4aad0a731a03 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 22 03:29:32 np0005531888 nova_compute[186788]: 2025-11-22 08:29:32.858 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:33 np0005531888 nova_compute[186788]: 2025-11-22 08:29:33.388 186792 DEBUG nova.compute.manager [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:33 np0005531888 nova_compute[186788]: 2025-11-22 08:29:33.388 186792 DEBUG oslo_concurrency.lockutils [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:33 np0005531888 nova_compute[186788]: 2025-11-22 08:29:33.388 186792 DEBUG oslo_concurrency.lockutils [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:33 np0005531888 nova_compute[186788]: 2025-11-22 08:29:33.388 186792 DEBUG oslo_concurrency.lockutils [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:33 np0005531888 nova_compute[186788]: 2025-11-22 08:29:33.389 186792 DEBUG nova.compute.manager [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:33 np0005531888 nova_compute[186788]: 2025-11-22 08:29:33.389 186792 WARNING nova.compute.manager [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 22 03:29:34 np0005531888 nova_compute[186788]: 2025-11-22 08:29:34.204 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:35 np0005531888 nova_compute[186788]: 2025-11-22 08:29:35.735 186792 DEBUG nova.compute.manager [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:29:35 np0005531888 nova_compute[186788]: 2025-11-22 08:29:35.736 186792 DEBUG oslo_concurrency.lockutils [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:35 np0005531888 nova_compute[186788]: 2025-11-22 08:29:35.736 186792 DEBUG oslo_concurrency.lockutils [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:35 np0005531888 nova_compute[186788]: 2025-11-22 08:29:35.737 186792 DEBUG oslo_concurrency.lockutils [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:35 np0005531888 nova_compute[186788]: 2025-11-22 08:29:35.737 186792 DEBUG nova.compute.manager [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:29:35 np0005531888 nova_compute[186788]: 2025-11-22 08:29:35.737 186792 WARNING nova.compute.manager [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:29:36 np0005531888 nova_compute[186788]: 2025-11-22 08:29:36.432 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:36 np0005531888 nova_compute[186788]: 2025-11-22 08:29:36.432 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:36 np0005531888 nova_compute[186788]: 2025-11-22 08:29:36.432 186792 DEBUG nova.compute.manager [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Going to confirm migration 20 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 03:29:36 np0005531888 nova_compute[186788]: 2025-11-22 08:29:36.455 186792 DEBUG nova.objects.instance [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'info_cache' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:29:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:36.847 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:36.847 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:36.847 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:37 np0005531888 nova_compute[186788]: 2025-11-22 08:29:37.860 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:38 np0005531888 nova_compute[186788]: 2025-11-22 08:29:38.803 186792 DEBUG neutronclient.v2_0.client [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:29:38 np0005531888 nova_compute[186788]: 2025-11-22 08:29:38.803 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:29:38 np0005531888 nova_compute[186788]: 2025-11-22 08:29:38.804 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:29:38 np0005531888 nova_compute[186788]: 2025-11-22 08:29:38.804 186792 DEBUG nova.network.neutron [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:29:39 np0005531888 nova_compute[186788]: 2025-11-22 08:29:39.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:42 np0005531888 nova_compute[186788]: 2025-11-22 08:29:42.861 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.902 186792 DEBUG nova.network.neutron [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.958 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.958 186792 DEBUG nova.objects.instance [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.996 186792 DEBUG nova.virt.libvirt.vif [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:29:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:29:33Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.996 186792 DEBUG nova.network.os_vif_util [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.997 186792 DEBUG nova.network.os_vif_util [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.997 186792 DEBUG os_vif [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.999 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6dd9383-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:29:43 np0005531888 nova_compute[186788]: 2025-11-22 08:29:43.999 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.001 186792 INFO os_vif [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f')#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.001 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.001 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.189 186792 DEBUG nova.compute.provider_tree [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.208 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.218 186792 DEBUG nova.scheduler.client.report [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.263 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.531 186792 INFO nova.scheduler.client.report [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocation for migration 379e335f-1bd3-4f90-85f8-6f71327f225f#033[00m
Nov 22 03:29:44 np0005531888 nova_compute[186788]: 2025-11-22 08:29:44.642 186792 DEBUG oslo_concurrency.lockutils [None req-83cc1400-d01d-4349-b0f6-5aef27411c8e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 8.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:45 np0005531888 podman[243811]: 2025-11-22 08:29:45.683795125 +0000 UTC m=+0.052242105 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:29:45 np0005531888 podman[243812]: 2025-11-22 08:29:45.683823666 +0000 UTC m=+0.046997176 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:29:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:29:47Z|00652|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Nov 22 03:29:47 np0005531888 nova_compute[186788]: 2025-11-22 08:29:47.863 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:49 np0005531888 nova_compute[186788]: 2025-11-22 08:29:49.211 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:52 np0005531888 nova_compute[186788]: 2025-11-22 08:29:52.865 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:54 np0005531888 nova_compute[186788]: 2025-11-22 08:29:54.213 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:57 np0005531888 podman[243858]: 2025-11-22 08:29:57.69383151 +0000 UTC m=+0.061960915 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:29:57 np0005531888 podman[243857]: 2025-11-22 08:29:57.698878984 +0000 UTC m=+0.069606033 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:29:57 np0005531888 nova_compute[186788]: 2025-11-22 08:29:57.869 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:58.153 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:58 np0005531888 nova_compute[186788]: 2025-11-22 08:29:58.153 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:29:58.154 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:29:59 np0005531888 nova_compute[186788]: 2025-11-22 08:29:59.215 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:00 np0005531888 podman[243898]: 2025-11-22 08:30:00.681671499 +0000 UTC m=+0.054360257 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 03:30:00 np0005531888 podman[243899]: 2025-11-22 08:30:00.701748063 +0000 UTC m=+0.066651500 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:30:00 np0005531888 podman[243900]: 2025-11-22 08:30:00.758485758 +0000 UTC m=+0.120608136 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:30:02 np0005531888 nova_compute[186788]: 2025-11-22 08:30:02.870 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:04 np0005531888 nova_compute[186788]: 2025-11-22 08:30:04.218 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:05 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:05.155 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:07 np0005531888 nova_compute[186788]: 2025-11-22 08:30:07.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:07 np0005531888 nova_compute[186788]: 2025-11-22 08:30:07.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:08 np0005531888 nova_compute[186788]: 2025-11-22 08:30:08.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:09 np0005531888 nova_compute[186788]: 2025-11-22 08:30:09.220 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:12 np0005531888 nova_compute[186788]: 2025-11-22 08:30:12.873 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:13 np0005531888 nova_compute[186788]: 2025-11-22 08:30:13.968 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:14 np0005531888 nova_compute[186788]: 2025-11-22 08:30:14.224 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:14 np0005531888 nova_compute[186788]: 2025-11-22 08:30:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:15 np0005531888 nova_compute[186788]: 2025-11-22 08:30:15.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:15 np0005531888 nova_compute[186788]: 2025-11-22 08:30:15.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:30:15 np0005531888 nova_compute[186788]: 2025-11-22 08:30:15.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:30:15 np0005531888 nova_compute[186788]: 2025-11-22 08:30:15.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:30:16 np0005531888 podman[243967]: 2025-11-22 08:30:16.679643607 +0000 UTC m=+0.051690172 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 22 03:30:16 np0005531888 podman[243966]: 2025-11-22 08:30:16.679992765 +0000 UTC m=+0.056813587 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:30:17 np0005531888 nova_compute[186788]: 2025-11-22 08:30:17.875 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:18 np0005531888 nova_compute[186788]: 2025-11-22 08:30:18.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:19 np0005531888 nova_compute[186788]: 2025-11-22 08:30:19.226 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:20 np0005531888 nova_compute[186788]: 2025-11-22 08:30:20.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:22 np0005531888 nova_compute[186788]: 2025-11-22 08:30:22.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:24 np0005531888 nova_compute[186788]: 2025-11-22 08:30:24.227 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:26 np0005531888 nova_compute[186788]: 2025-11-22 08:30:26.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:26 np0005531888 nova_compute[186788]: 2025-11-22 08:30:26.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:26 np0005531888 nova_compute[186788]: 2025-11-22 08:30:26.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:26 np0005531888 nova_compute[186788]: 2025-11-22 08:30:26.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:26 np0005531888 nova_compute[186788]: 2025-11-22 08:30:26.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:26 np0005531888 nova_compute[186788]: 2025-11-22 08:30:26.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.166 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.167 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5705MB free_disk=73.26581954956055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.167 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.167 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.231 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.231 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.245 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.317 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.318 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.338 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.358 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.389 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.402 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.403 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.403 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:27 np0005531888 nova_compute[186788]: 2025-11-22 08:30:27.878 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:28 np0005531888 podman[244011]: 2025-11-22 08:30:28.679673814 +0000 UTC m=+0.056656194 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:30:28 np0005531888 podman[244012]: 2025-11-22 08:30:28.679683274 +0000 UTC m=+0.050760749 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:30:29 np0005531888 nova_compute[186788]: 2025-11-22 08:30:29.230 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:29 np0005531888 nova_compute[186788]: 2025-11-22 08:30:29.402 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:29 np0005531888 nova_compute[186788]: 2025-11-22 08:30:29.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:29 np0005531888 nova_compute[186788]: 2025-11-22 08:30:29.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:30:31 np0005531888 podman[244053]: 2025-11-22 08:30:31.689174358 +0000 UTC m=+0.060834218 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64)
Nov 22 03:30:31 np0005531888 podman[244054]: 2025-11-22 08:30:31.691605618 +0000 UTC m=+0.059240689 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:30:31 np0005531888 podman[244055]: 2025-11-22 08:30:31.722641641 +0000 UTC m=+0.086675703 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 22 03:30:32 np0005531888 nova_compute[186788]: 2025-11-22 08:30:32.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.491 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.492 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.515 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.628 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.628 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.634 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.634 186792 INFO nova.compute.claims [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.781 186792 DEBUG nova.compute.provider_tree [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.792 186792 DEBUG nova.scheduler.client.report [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.816 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.817 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.879 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.880 186792 DEBUG nova.network.neutron [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.903 186792 INFO nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:30:33 np0005531888 nova_compute[186788]: 2025-11-22 08:30:33.924 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.024 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.025 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.026 186792 INFO nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Creating image(s)#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.027 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.027 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.028 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.040 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.099 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.100 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.101 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.113 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.171 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.172 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.230 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.503 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk 1073741824" returned: 0 in 0.331s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.504 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.504 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.558 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.559 186792 DEBUG nova.virt.disk.api [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.559 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.614 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.615 186792 DEBUG nova.virt.disk.api [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.615 186792 DEBUG nova.objects.instance [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.650 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.651 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Ensure instance console log exists: /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.651 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.652 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.652 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:34 np0005531888 nova_compute[186788]: 2025-11-22 08:30:34.898 186792 DEBUG nova.policy [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:30:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:36.848 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:36.848 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:36.848 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:30:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531888 nova_compute[186788]: 2025-11-22 08:30:36.857 186792 DEBUG nova.network.neutron [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Successfully created port: 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:30:37 np0005531888 nova_compute[186788]: 2025-11-22 08:30:37.881 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.299 186792 DEBUG nova.network.neutron [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Successfully updated port: 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.436 186792 DEBUG nova.compute.manager [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-changed-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.436 186792 DEBUG nova.compute.manager [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Refreshing instance network info cache due to event network-changed-9ea5c424-dfb2-4fc9-adaa-06a42cf88172. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.436 186792 DEBUG oslo_concurrency.lockutils [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.436 186792 DEBUG oslo_concurrency.lockutils [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.436 186792 DEBUG nova.network.neutron [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Refreshing network info cache for port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.674 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:30:38 np0005531888 nova_compute[186788]: 2025-11-22 08:30:38.891 186792 DEBUG nova.network.neutron [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:30:39 np0005531888 nova_compute[186788]: 2025-11-22 08:30:39.232 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:39 np0005531888 nova_compute[186788]: 2025-11-22 08:30:39.863 186792 DEBUG nova.network.neutron [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:30:39 np0005531888 nova_compute[186788]: 2025-11-22 08:30:39.876 186792 DEBUG oslo_concurrency.lockutils [req-d38aa9b6-1d0a-45e3-872b-a9d8f6316c7f req-2f193ff7-a19c-4e9d-bcc9-8a5596e49a12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:30:39 np0005531888 nova_compute[186788]: 2025-11-22 08:30:39.877 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:30:39 np0005531888 nova_compute[186788]: 2025-11-22 08:30:39.877 186792 DEBUG nova.network.neutron [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:30:40 np0005531888 nova_compute[186788]: 2025-11-22 08:30:40.016 186792 DEBUG nova.network.neutron [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.183 186792 DEBUG nova.network.neutron [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.208 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.209 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance network_info: |[{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.211 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Start _get_guest_xml network_info=[{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.214 186792 WARNING nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.218 186792 DEBUG nova.virt.libvirt.host [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.218 186792 DEBUG nova.virt.libvirt.host [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.221 186792 DEBUG nova.virt.libvirt.host [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.222 186792 DEBUG nova.virt.libvirt.host [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.223 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.223 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.224 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.224 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.224 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.224 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.225 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.225 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.225 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.225 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.226 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.226 186792 DEBUG nova.virt.hardware [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.229 186792 DEBUG nova.virt.libvirt.vif [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096437739',display_name='tempest-TestNetworkAdvancedServerOps-server-2096437739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096437739',id=163,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG13vP6TorI1BfgP51WB45YlXJdiZADs9Vr8WDj5tFB5h4MOu4V2srvEo0mvIjwOIArHDXlyFjzjTA9S2znrw3FTS7dEtIJ8YprpE+/VSrV3SFmnANGESGFkInD+qAEFVA==',key_name='tempest-TestNetworkAdvancedServerOps-415974526',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-uyrdhl9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:30:33Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=6074046e-cf5c-4db5-9662-721f727de670,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.229 186792 DEBUG nova.network.os_vif_util [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.230 186792 DEBUG nova.network.os_vif_util [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.231 186792 DEBUG nova.objects.instance [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.248 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <uuid>6074046e-cf5c-4db5-9662-721f727de670</uuid>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <name>instance-000000a3</name>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2096437739</nova:name>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:30:41</nova:creationTime>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        <nova:port uuid="9ea5c424-dfb2-4fc9-adaa-06a42cf88172">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <entry name="serial">6074046e-cf5c-4db5-9662-721f727de670</entry>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <entry name="uuid">6074046e-cf5c-4db5-9662-721f727de670</entry>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:e9:97:d8"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <target dev="tap9ea5c424-df"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/console.log" append="off"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:30:41 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:30:41 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:30:41 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:30:41 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.249 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Preparing to wait for external event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.249 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.249 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.249 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.250 186792 DEBUG nova.virt.libvirt.vif [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096437739',display_name='tempest-TestNetworkAdvancedServerOps-server-2096437739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096437739',id=163,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG13vP6TorI1BfgP51WB45YlXJdiZADs9Vr8WDj5tFB5h4MOu4V2srvEo0mvIjwOIArHDXlyFjzjTA9S2znrw3FTS7dEtIJ8YprpE+/VSrV3SFmnANGESGFkInD+qAEFVA==',key_name='tempest-TestNetworkAdvancedServerOps-415974526',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-uyrdhl9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:30:33Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=6074046e-cf5c-4db5-9662-721f727de670,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.250 186792 DEBUG nova.network.os_vif_util [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.251 186792 DEBUG nova.network.os_vif_util [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.251 186792 DEBUG os_vif [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.252 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.252 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.252 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.254 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.255 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ea5c424-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.255 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ea5c424-df, col_values=(('external_ids', {'iface-id': '9ea5c424-dfb2-4fc9-adaa-06a42cf88172', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:97:d8', 'vm-uuid': '6074046e-cf5c-4db5-9662-721f727de670'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:41 np0005531888 NetworkManager[55166]: <info>  [1763800241.2578] manager: (tap9ea5c424-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.263 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.264 186792 INFO os_vif [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df')#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.418 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.418 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.419 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:e9:97:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:30:41 np0005531888 nova_compute[186788]: 2025-11-22 08:30:41.419 186792 INFO nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Using config drive#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.001 186792 INFO nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Creating config drive at /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.005 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwuezvubb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.131 186792 DEBUG oslo_concurrency.processutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwuezvubb" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:30:42 np0005531888 kernel: tap9ea5c424-df: entered promiscuous mode
Nov 22 03:30:42 np0005531888 NetworkManager[55166]: <info>  [1763800242.1814] manager: (tap9ea5c424-df): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Nov 22 03:30:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:42Z|00653|binding|INFO|Claiming lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for this chassis.
Nov 22 03:30:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:42Z|00654|binding|INFO|9ea5c424-dfb2-4fc9-adaa-06a42cf88172: Claiming fa:16:3e:e9:97:d8 10.100.0.3
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.182 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.186 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.189 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531888 systemd-udevd[244154]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:30:42 np0005531888 systemd-machined[153106]: New machine qemu-79-instance-000000a3.
Nov 22 03:30:42 np0005531888 NetworkManager[55166]: <info>  [1763800242.2230] device (tap9ea5c424-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:30:42 np0005531888 NetworkManager[55166]: <info>  [1763800242.2238] device (tap9ea5c424-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.236 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:97:d8 10.100.0.3'], port_security=['fa:16:3e:e9:97:d8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6074046e-cf5c-4db5-9662-721f727de670', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99ffb7a4-3c4f-451f-858c-67610bd9b1c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30043869-5124-4227-86b6-fe04ab3139a4, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=9ea5c424-dfb2-4fc9-adaa-06a42cf88172) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.238 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 in datapath 3a9ba314-47b1-4454-bcbf-13054f5b67cd bound to our chassis#033[00m
Nov 22 03:30:42 np0005531888 systemd[1]: Started Virtual Machine qemu-79-instance-000000a3.
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.239 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a9ba314-47b1-4454-bcbf-13054f5b67cd#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.241 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:42Z|00655|binding|INFO|Setting lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 ovn-installed in OVS
Nov 22 03:30:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:42Z|00656|binding|INFO|Setting lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 up in Southbound
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.249 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.252 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ab314f-7ed8-486a-a1e1-5b366c023caa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.253 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a9ba314-41 in ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.255 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a9ba314-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.255 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b64fd01a-1371-4a8e-8794-dddaa4ba1e72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.256 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1df83693-d9eb-41da-ab2a-aa614a57d016]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.267 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0846476d-fc90-4502-8336-9edda4d33a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.291 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec03683-62e2-4c8e-83c5-63b9e40eba3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.320 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8b73e654-28b6-4b32-9f61-d528cfbd88a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 systemd-udevd[244157]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.328 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f7d848-07d5-4dfd-ba5f-d8e5f4a4ecc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 NetworkManager[55166]: <info>  [1763800242.3309] manager: (tap3a9ba314-40): new Veth device (/org/freedesktop/NetworkManager/Devices/310)
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.363 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[94e84ed2-c4c1-4d4b-8d4e-72391cebbb50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.366 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[19be0590-0c7f-4d23-b651-4dbc2a228877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 NetworkManager[55166]: <info>  [1763800242.3895] device (tap3a9ba314-40): carrier: link connected
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.395 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[83c077a8-38dd-4525-835b-cf2f81817d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.410 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[56005382-e873-4f38-b4ae-5ee067424314]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9ba314-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c2:73:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698303, 'reachable_time': 41932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244188, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.426 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a8fc9d39-5346-4a66-a9ff-72e672b2ee2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec2:73c6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 698303, 'tstamp': 698303}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244189, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.442 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[183f47cf-73e6-47a0-b911-e1b9d2f805b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9ba314-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c2:73:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698303, 'reachable_time': 41932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244190, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.470 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a79e7fb3-200b-4a46-845f-2d17d2d24481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.521 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[afebe90e-a483-47cb-bfc3-1ea787c343a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.523 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9ba314-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.523 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.524 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a9ba314-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:42 np0005531888 NetworkManager[55166]: <info>  [1763800242.5261] manager: (tap3a9ba314-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Nov 22 03:30:42 np0005531888 kernel: tap3a9ba314-40: entered promiscuous mode
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.528 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a9ba314-40, col_values=(('external_ids', {'iface-id': 'c72d72ff-6f7f-4cf0-b2ac-4983e5b20e5d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:42 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:42Z|00657|binding|INFO|Releasing lport c72d72ff-6f7f-4cf0-b2ac-4983e5b20e5d from this chassis (sb_readonly=0)
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.530 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a9ba314-47b1-4454-bcbf-13054f5b67cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a9ba314-47b1-4454-bcbf-13054f5b67cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.531 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9272b636-082e-4398-b0d1-b59add52b9e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.532 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-3a9ba314-47b1-4454-bcbf-13054f5b67cd
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/3a9ba314-47b1-4454-bcbf-13054f5b67cd.pid.haproxy
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 3a9ba314-47b1-4454-bcbf-13054f5b67cd
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:30:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:30:42.533 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'env', 'PROCESS_TAG=haproxy-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a9ba314-47b1-4454-bcbf-13054f5b67cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.539 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.597 186792 DEBUG nova.compute.manager [req-54d23d9e-eea8-4d09-b3e3-7c7dff965506 req-9bac94c8-2e01-42ea-b28a-7f2fa1aa40e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.598 186792 DEBUG oslo_concurrency.lockutils [req-54d23d9e-eea8-4d09-b3e3-7c7dff965506 req-9bac94c8-2e01-42ea-b28a-7f2fa1aa40e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.598 186792 DEBUG oslo_concurrency.lockutils [req-54d23d9e-eea8-4d09-b3e3-7c7dff965506 req-9bac94c8-2e01-42ea-b28a-7f2fa1aa40e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.599 186792 DEBUG oslo_concurrency.lockutils [req-54d23d9e-eea8-4d09-b3e3-7c7dff965506 req-9bac94c8-2e01-42ea-b28a-7f2fa1aa40e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:42 np0005531888 nova_compute[186788]: 2025-11-22 08:30:42.599 186792 DEBUG nova.compute.manager [req-54d23d9e-eea8-4d09-b3e3-7c7dff965506 req-9bac94c8-2e01-42ea-b28a-7f2fa1aa40e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Processing event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:30:42 np0005531888 podman[244223]: 2025-11-22 08:30:42.865798947 +0000 UTC m=+0.037515932 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.034 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.035 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800243.0332766, 6074046e-cf5c-4db5-9662-721f727de670 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.035 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] VM Started (Lifecycle Event)#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.038 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.041 186792 INFO nova.virt.libvirt.driver [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance spawned successfully.#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.042 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.054 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.060 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.065 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.066 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.066 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.066 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.067 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.067 186792 DEBUG nova.virt.libvirt.driver [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.089 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.089 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800243.034545, 6074046e-cf5c-4db5-9662-721f727de670 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.090 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.109 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.114 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800243.0373137, 6074046e-cf5c-4db5-9662-721f727de670 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.114 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.125 186792 INFO nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Took 9.10 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.127 186792 DEBUG nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.130 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.136 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.164 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.200 186792 INFO nova.compute.manager [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Took 9.62 seconds to build instance.#033[00m
Nov 22 03:30:43 np0005531888 nova_compute[186788]: 2025-11-22 08:30:43.223 186792 DEBUG oslo_concurrency.lockutils [None req-3c3c8717-bb7d-4f26-a3f9-f06731398fd6 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:43 np0005531888 podman[244223]: 2025-11-22 08:30:43.224603741 +0000 UTC m=+0.396320696 container create 57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 03:30:43 np0005531888 systemd[1]: Started libpod-conmon-57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3.scope.
Nov 22 03:30:43 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:30:43 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6126303fcc9351f80dfc6d35bbe1a43198876648c013687a0b9e61c9593b004/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:30:43 np0005531888 podman[244223]: 2025-11-22 08:30:43.446794884 +0000 UTC m=+0.618511899 container init 57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:30:43 np0005531888 podman[244223]: 2025-11-22 08:30:43.454113525 +0000 UTC m=+0.625830480 container start 57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:30:43 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [NOTICE]   (244249) : New worker (244251) forked
Nov 22 03:30:43 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [NOTICE]   (244249) : Loading success.
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.233 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.720 186792 DEBUG nova.compute.manager [req-be166d2f-dbf6-4896-af79-460e6c831a2d req-d7e07370-db96-4375-b949-713eec54c507 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.721 186792 DEBUG oslo_concurrency.lockutils [req-be166d2f-dbf6-4896-af79-460e6c831a2d req-d7e07370-db96-4375-b949-713eec54c507 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.721 186792 DEBUG oslo_concurrency.lockutils [req-be166d2f-dbf6-4896-af79-460e6c831a2d req-d7e07370-db96-4375-b949-713eec54c507 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.722 186792 DEBUG oslo_concurrency.lockutils [req-be166d2f-dbf6-4896-af79-460e6c831a2d req-d7e07370-db96-4375-b949-713eec54c507 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.722 186792 DEBUG nova.compute.manager [req-be166d2f-dbf6-4896-af79-460e6c831a2d req-d7e07370-db96-4375-b949-713eec54c507 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:30:44 np0005531888 nova_compute[186788]: 2025-11-22 08:30:44.722 186792 WARNING nova.compute.manager [req-be166d2f-dbf6-4896-af79-460e6c831a2d req-d7e07370-db96-4375-b949-713eec54c507 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:30:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:46Z|00658|binding|INFO|Releasing lport c72d72ff-6f7f-4cf0-b2ac-4983e5b20e5d from this chassis (sb_readonly=0)
Nov 22 03:30:46 np0005531888 NetworkManager[55166]: <info>  [1763800246.1437] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Nov 22 03:30:46 np0005531888 NetworkManager[55166]: <info>  [1763800246.1443] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.161 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:46 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:46Z|00659|binding|INFO|Releasing lport c72d72ff-6f7f-4cf0-b2ac-4983e5b20e5d from this chassis (sb_readonly=0)
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.175 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.180 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.396 186792 DEBUG nova.compute.manager [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-changed-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.397 186792 DEBUG nova.compute.manager [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Refreshing instance network info cache due to event network-changed-9ea5c424-dfb2-4fc9-adaa-06a42cf88172. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.397 186792 DEBUG oslo_concurrency.lockutils [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.397 186792 DEBUG oslo_concurrency.lockutils [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.397 186792 DEBUG nova.network.neutron [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Refreshing network info cache for port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:30:46 np0005531888 nova_compute[186788]: 2025-11-22 08:30:46.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:47 np0005531888 nova_compute[186788]: 2025-11-22 08:30:47.589 186792 DEBUG nova.network.neutron [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updated VIF entry in instance network info cache for port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:30:47 np0005531888 nova_compute[186788]: 2025-11-22 08:30:47.590 186792 DEBUG nova.network.neutron [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:30:47 np0005531888 nova_compute[186788]: 2025-11-22 08:30:47.613 186792 DEBUG oslo_concurrency.lockutils [req-c61b31a3-7f31-4faf-82f8-3d9700d8d545 req-3aa47559-2b0d-4eeb-953b-6313b927d4dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:30:47 np0005531888 podman[244262]: 2025-11-22 08:30:47.676777451 +0000 UTC m=+0.051729463 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:30:47 np0005531888 podman[244263]: 2025-11-22 08:30:47.692308673 +0000 UTC m=+0.058085339 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:30:49 np0005531888 nova_compute[186788]: 2025-11-22 08:30:49.235 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:51 np0005531888 nova_compute[186788]: 2025-11-22 08:30:51.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:54 np0005531888 nova_compute[186788]: 2025-11-22 08:30:54.237 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:56 np0005531888 nova_compute[186788]: 2025-11-22 08:30:56.264 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:59 np0005531888 nova_compute[186788]: 2025-11-22 08:30:59.238 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:59Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e9:97:d8 10.100.0.3
Nov 22 03:30:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:30:59Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:97:d8 10.100.0.3
Nov 22 03:30:59 np0005531888 podman[244322]: 2025-11-22 08:30:59.692448465 +0000 UTC m=+0.054536862 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:30:59 np0005531888 podman[244321]: 2025-11-22 08:30:59.709083064 +0000 UTC m=+0.074223936 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:31:01 np0005531888 nova_compute[186788]: 2025-11-22 08:31:01.267 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:02 np0005531888 podman[244364]: 2025-11-22 08:31:02.690390774 +0000 UTC m=+0.058601891 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 22 03:31:02 np0005531888 podman[244365]: 2025-11-22 08:31:02.699384946 +0000 UTC m=+0.065622365 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:31:02 np0005531888 podman[244366]: 2025-11-22 08:31:02.728705057 +0000 UTC m=+0.090713312 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:31:04 np0005531888 nova_compute[186788]: 2025-11-22 08:31:04.240 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:05 np0005531888 nova_compute[186788]: 2025-11-22 08:31:05.779 186792 INFO nova.compute.manager [None req-32da4f06-ddc8-40e7-b241-324a4fb73ce4 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Get console output#033[00m
Nov 22 03:31:05 np0005531888 nova_compute[186788]: 2025-11-22 08:31:05.787 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.141 186792 DEBUG oslo_concurrency.lockutils [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.142 186792 DEBUG oslo_concurrency.lockutils [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.142 186792 DEBUG nova.compute.manager [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.146 186792 DEBUG nova.compute.manager [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.148 186792 DEBUG nova.objects.instance [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'flavor' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.178 186792 DEBUG nova.objects.instance [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'info_cache' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.199 186792 DEBUG nova.virt.libvirt.driver [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:31:06 np0005531888 nova_compute[186788]: 2025-11-22 08:31:06.269 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:08 np0005531888 kernel: tap9ea5c424-df (unregistering): left promiscuous mode
Nov 22 03:31:08 np0005531888 NetworkManager[55166]: <info>  [1763800268.3862] device (tap9ea5c424-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.398 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:08Z|00660|binding|INFO|Releasing lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 from this chassis (sb_readonly=0)
Nov 22 03:31:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:08Z|00661|binding|INFO|Setting lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 down in Southbound
Nov 22 03:31:08 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:08Z|00662|binding|INFO|Removing iface tap9ea5c424-df ovn-installed in OVS
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.401 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:08.407 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:97:d8 10.100.0.3'], port_security=['fa:16:3e:e9:97:d8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6074046e-cf5c-4db5-9662-721f727de670', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99ffb7a4-3c4f-451f-858c-67610bd9b1c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30043869-5124-4227-86b6-fe04ab3139a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=9ea5c424-dfb2-4fc9-adaa-06a42cf88172) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:31:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:08.409 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 in datapath 3a9ba314-47b1-4454-bcbf-13054f5b67cd unbound from our chassis#033[00m
Nov 22 03:31:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:08.410 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a9ba314-47b1-4454-bcbf-13054f5b67cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:31:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:08.411 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[86b86142-3c68-4be4-a476-f7fd9d1930ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:08.412 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd namespace which is not needed anymore#033[00m
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.415 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:08 np0005531888 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Nov 22 03:31:08 np0005531888 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a3.scope: Consumed 16.449s CPU time.
Nov 22 03:31:08 np0005531888 systemd-machined[153106]: Machine qemu-79-instance-000000a3 terminated.
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.578 186792 DEBUG nova.compute.manager [req-26e23801-5018-4f9d-b96a-114020d5c026 req-8b0d31b0-9b12-44eb-90ac-79ef9a951690 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-unplugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.579 186792 DEBUG oslo_concurrency.lockutils [req-26e23801-5018-4f9d-b96a-114020d5c026 req-8b0d31b0-9b12-44eb-90ac-79ef9a951690 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.579 186792 DEBUG oslo_concurrency.lockutils [req-26e23801-5018-4f9d-b96a-114020d5c026 req-8b0d31b0-9b12-44eb-90ac-79ef9a951690 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.579 186792 DEBUG oslo_concurrency.lockutils [req-26e23801-5018-4f9d-b96a-114020d5c026 req-8b0d31b0-9b12-44eb-90ac-79ef9a951690 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.579 186792 DEBUG nova.compute.manager [req-26e23801-5018-4f9d-b96a-114020d5c026 req-8b0d31b0-9b12-44eb-90ac-79ef9a951690 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-unplugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:31:08 np0005531888 nova_compute[186788]: 2025-11-22 08:31:08.579 186792 WARNING nova.compute.manager [req-26e23801-5018-4f9d-b96a-114020d5c026 req-8b0d31b0-9b12-44eb-90ac-79ef9a951690 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-unplugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state active and task_state powering-off.#033[00m
Nov 22 03:31:08 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [NOTICE]   (244249) : haproxy version is 2.8.14-c23fe91
Nov 22 03:31:08 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [NOTICE]   (244249) : path to executable is /usr/sbin/haproxy
Nov 22 03:31:08 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [WARNING]  (244249) : Exiting Master process...
Nov 22 03:31:08 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [WARNING]  (244249) : Exiting Master process...
Nov 22 03:31:08 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [ALERT]    (244249) : Current worker (244251) exited with code 143 (Terminated)
Nov 22 03:31:08 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244245]: [WARNING]  (244249) : All workers exited. Exiting... (0)
Nov 22 03:31:08 np0005531888 systemd[1]: libpod-57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3.scope: Deactivated successfully.
Nov 22 03:31:08 np0005531888 podman[244450]: 2025-11-22 08:31:08.713616425 +0000 UTC m=+0.209202165 container died 57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:31:08 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3-userdata-shm.mount: Deactivated successfully.
Nov 22 03:31:08 np0005531888 systemd[1]: var-lib-containers-storage-overlay-a6126303fcc9351f80dfc6d35bbe1a43198876648c013687a0b9e61c9593b004-merged.mount: Deactivated successfully.
Nov 22 03:31:09 np0005531888 nova_compute[186788]: 2025-11-22 08:31:09.215 186792 INFO nova.virt.libvirt.driver [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 03:31:09 np0005531888 nova_compute[186788]: 2025-11-22 08:31:09.220 186792 INFO nova.virt.libvirt.driver [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance destroyed successfully.#033[00m
Nov 22 03:31:09 np0005531888 nova_compute[186788]: 2025-11-22 08:31:09.220 186792 DEBUG nova.objects.instance [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'numa_topology' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:09 np0005531888 nova_compute[186788]: 2025-11-22 08:31:09.234 186792 DEBUG nova.compute.manager [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:31:09 np0005531888 nova_compute[186788]: 2025-11-22 08:31:09.241 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:09 np0005531888 nova_compute[186788]: 2025-11-22 08:31:09.303 186792 DEBUG oslo_concurrency.lockutils [None req-46fc6841-fb40-4adf-a989-10e07cac761c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:09 np0005531888 podman[244450]: 2025-11-22 08:31:09.606619484 +0000 UTC m=+1.102205224 container cleanup 57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:31:09 np0005531888 systemd[1]: libpod-conmon-57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3.scope: Deactivated successfully.
Nov 22 03:31:10 np0005531888 podman[244498]: 2025-11-22 08:31:10.234203716 +0000 UTC m=+0.606874314 container remove 57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.241 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f0da32-a5c9-4ee1-b5a6-a6559cfc8193]: (4, ('Sat Nov 22 08:31:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd (57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3)\n57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3\nSat Nov 22 08:31:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd (57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3)\n57a3bee297451863a6db66e3e53b6d7d1a3926f96657ed4ea362350df0c976f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.243 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[47c3162c-267d-48d5-ac6d-c2a6747f19d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.244 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9ba314-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.246 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:10 np0005531888 kernel: tap3a9ba314-40: left promiscuous mode
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.262 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.266 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff53fe0-a05d-44fc-aab5-ca366bb773b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.283 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[47b46101-2abc-4147-b58e-d1a507d49ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.286 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[aea6573b-8484-4103-ac46-17e3cbee8c86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.307 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c34ecb32-cdce-4580-b380-19bc8d00d9b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698295, 'reachable_time': 17547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244518, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.309 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:31:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:10.310 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[631313bd-75e7-46eb-bebe-aab72e4441bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:10 np0005531888 systemd[1]: run-netns-ovnmeta\x2d3a9ba314\x2d47b1\x2d4454\x2dbcbf\x2d13054f5b67cd.mount: Deactivated successfully.
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.679 186792 DEBUG nova.compute.manager [req-1d26953f-b345-4e14-8ddf-93946eb090f7 req-aff80759-d93a-4440-a93d-f7e2479df291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.680 186792 DEBUG oslo_concurrency.lockutils [req-1d26953f-b345-4e14-8ddf-93946eb090f7 req-aff80759-d93a-4440-a93d-f7e2479df291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.680 186792 DEBUG oslo_concurrency.lockutils [req-1d26953f-b345-4e14-8ddf-93946eb090f7 req-aff80759-d93a-4440-a93d-f7e2479df291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.681 186792 DEBUG oslo_concurrency.lockutils [req-1d26953f-b345-4e14-8ddf-93946eb090f7 req-aff80759-d93a-4440-a93d-f7e2479df291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.681 186792 DEBUG nova.compute.manager [req-1d26953f-b345-4e14-8ddf-93946eb090f7 req-aff80759-d93a-4440-a93d-f7e2479df291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:31:10 np0005531888 nova_compute[186788]: 2025-11-22 08:31:10.682 186792 WARNING nova.compute.manager [req-1d26953f-b345-4e14-8ddf-93946eb090f7 req-aff80759-d93a-4440-a93d-f7e2479df291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state stopped and task_state None.#033[00m
Nov 22 03:31:11 np0005531888 nova_compute[186788]: 2025-11-22 08:31:11.270 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.243 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.302 186792 INFO nova.compute.manager [None req-2f612900-b65e-4aac-937a-0041ca2d1f95 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Get console output#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.566 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'flavor' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.611 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'info_cache' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.641 186792 DEBUG oslo_concurrency.lockutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.642 186792 DEBUG oslo_concurrency.lockutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:31:14 np0005531888 nova_compute[186788]: 2025-11-22 08:31:14.642 186792 DEBUG nova.network.neutron [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:31:15 np0005531888 nova_compute[186788]: 2025-11-22 08:31:15.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:15 np0005531888 nova_compute[186788]: 2025-11-22 08:31:15.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:15 np0005531888 nova_compute[186788]: 2025-11-22 08:31:15.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:31:15 np0005531888 nova_compute[186788]: 2025-11-22 08:31:15.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:31:15 np0005531888 nova_compute[186788]: 2025-11-22 08:31:15.968 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.274 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.934 186792 DEBUG nova.network.neutron [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.954 186792 DEBUG oslo_concurrency.lockutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.956 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.957 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.958 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.996 186792 INFO nova.virt.libvirt.driver [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance destroyed successfully.#033[00m
Nov 22 03:31:16 np0005531888 nova_compute[186788]: 2025-11-22 08:31:16.997 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'numa_topology' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.009 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.021 186792 DEBUG nova.virt.libvirt.vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096437739',display_name='tempest-TestNetworkAdvancedServerOps-server-2096437739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096437739',id=163,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG13vP6TorI1BfgP51WB45YlXJdiZADs9Vr8WDj5tFB5h4MOu4V2srvEo0mvIjwOIArHDXlyFjzjTA9S2znrw3FTS7dEtIJ8YprpE+/VSrV3SFmnANGESGFkInD+qAEFVA==',key_name='tempest-TestNetworkAdvancedServerOps-415974526',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:30:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-uyrdhl9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:31:09Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=6074046e-cf5c-4db5-9662-721f727de670,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.022 186792 DEBUG nova.network.os_vif_util [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.023 186792 DEBUG nova.network.os_vif_util [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.024 186792 DEBUG os_vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.025 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.026 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ea5c424-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.028 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.029 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.031 186792 INFO os_vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df')#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.039 186792 DEBUG nova.virt.libvirt.driver [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Start _get_guest_xml network_info=[{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.043 186792 WARNING nova.virt.libvirt.driver [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.048 186792 DEBUG nova.virt.libvirt.host [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.049 186792 DEBUG nova.virt.libvirt.host [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.053 186792 DEBUG nova.virt.libvirt.host [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.053 186792 DEBUG nova.virt.libvirt.host [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.054 186792 DEBUG nova.virt.libvirt.driver [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.054 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.055 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.055 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.055 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.056 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.056 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.056 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.056 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.057 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.057 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.057 186792 DEBUG nova.virt.hardware [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.057 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.072 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.128 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.129 186792 DEBUG oslo_concurrency.lockutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.129 186792 DEBUG oslo_concurrency.lockutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.130 186792 DEBUG oslo_concurrency.lockutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.131 186792 DEBUG nova.virt.libvirt.vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096437739',display_name='tempest-TestNetworkAdvancedServerOps-server-2096437739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096437739',id=163,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG13vP6TorI1BfgP51WB45YlXJdiZADs9Vr8WDj5tFB5h4MOu4V2srvEo0mvIjwOIArHDXlyFjzjTA9S2znrw3FTS7dEtIJ8YprpE+/VSrV3SFmnANGESGFkInD+qAEFVA==',key_name='tempest-TestNetworkAdvancedServerOps-415974526',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:30:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-uyrdhl9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:31:09Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=6074046e-cf5c-4db5-9662-721f727de670,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.131 186792 DEBUG nova.network.os_vif_util [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.132 186792 DEBUG nova.network.os_vif_util [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.133 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.146 186792 DEBUG nova.virt.libvirt.driver [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <uuid>6074046e-cf5c-4db5-9662-721f727de670</uuid>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <name>instance-000000a3</name>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2096437739</nova:name>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:31:17</nova:creationTime>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        <nova:port uuid="9ea5c424-dfb2-4fc9-adaa-06a42cf88172">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <entry name="serial">6074046e-cf5c-4db5-9662-721f727de670</entry>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <entry name="uuid">6074046e-cf5c-4db5-9662-721f727de670</entry>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk.config"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:e9:97:d8"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <target dev="tap9ea5c424-df"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/console.log" append="off"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <input type="keyboard" bus="usb"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:31:17 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:31:17 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:31:17 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:31:17 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.148 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.217 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.219 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.274 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.276 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.290 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.351 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.352 186792 DEBUG nova.virt.disk.api [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.352 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.418 186792 DEBUG oslo_concurrency.processutils [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.419 186792 DEBUG nova.virt.disk.api [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.419 186792 DEBUG nova.objects.instance [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.433 186792 DEBUG nova.virt.libvirt.vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096437739',display_name='tempest-TestNetworkAdvancedServerOps-server-2096437739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096437739',id=163,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG13vP6TorI1BfgP51WB45YlXJdiZADs9Vr8WDj5tFB5h4MOu4V2srvEo0mvIjwOIArHDXlyFjzjTA9S2znrw3FTS7dEtIJ8YprpE+/VSrV3SFmnANGESGFkInD+qAEFVA==',key_name='tempest-TestNetworkAdvancedServerOps-415974526',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:30:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-uyrdhl9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:31:09Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=6074046e-cf5c-4db5-9662-721f727de670,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.433 186792 DEBUG nova.network.os_vif_util [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.434 186792 DEBUG nova.network.os_vif_util [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.434 186792 DEBUG os_vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.435 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.436 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.443 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ea5c424-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.445 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ea5c424-df, col_values=(('external_ids', {'iface-id': '9ea5c424-dfb2-4fc9-adaa-06a42cf88172', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:97:d8', 'vm-uuid': '6074046e-cf5c-4db5-9662-721f727de670'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.4487] manager: (tap9ea5c424-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.452 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.453 186792 INFO os_vif [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df')#033[00m
Nov 22 03:31:17 np0005531888 kernel: tap9ea5c424-df: entered promiscuous mode
Nov 22 03:31:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:17Z|00663|binding|INFO|Claiming lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for this chassis.
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.6510] manager: (tap9ea5c424-df): new Tun device (/org/freedesktop/NetworkManager/Devices/315)
Nov 22 03:31:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:17Z|00664|binding|INFO|9ea5c424-dfb2-4fc9-adaa-06a42cf88172: Claiming fa:16:3e:e9:97:d8 10.100.0.3
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.659 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:97:d8 10.100.0.3'], port_security=['fa:16:3e:e9:97:d8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6074046e-cf5c-4db5-9662-721f727de670', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '5', 'neutron:security_group_ids': '99ffb7a4-3c4f-451f-858c-67610bd9b1c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30043869-5124-4227-86b6-fe04ab3139a4, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=9ea5c424-dfb2-4fc9-adaa-06a42cf88172) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.660 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 in datapath 3a9ba314-47b1-4454-bcbf-13054f5b67cd bound to our chassis#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.661 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a9ba314-47b1-4454-bcbf-13054f5b67cd#033[00m
Nov 22 03:31:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:17Z|00665|binding|INFO|Setting lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 ovn-installed in OVS
Nov 22 03:31:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:17Z|00666|binding|INFO|Setting lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 up in Southbound
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.663 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.665 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.675 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e06f119f-1106-4ae0-98de-e4ef52338dba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.676 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a9ba314-41 in ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.678 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a9ba314-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.678 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fd4660fc-b308-4023-9648-6a67e2429025]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 systemd-udevd[244551]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.679 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[625d3d2c-9a7c-452e-8b78-9353df920ca1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 systemd-machined[153106]: New machine qemu-80-instance-000000a3.
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.6911] device (tap9ea5c424-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.6917] device (tap9ea5c424-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.691 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[5e728d1f-9d74-4a7a-841d-0bf40d2dbe40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.704 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0b32eed0-1efb-4006-8989-fbdae57fd44c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 systemd[1]: Started Virtual Machine qemu-80-instance-000000a3.
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.733 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[54aad874-82b5-44c9-a8c8-1282b985d2da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.7406] manager: (tap3a9ba314-40): new Veth device (/org/freedesktop/NetworkManager/Devices/316)
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.739 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[640f5dc4-6788-4995-8f0b-2ec2956795ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.772 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[d116ebd7-8fae-4a75-8bbd-dc4e0a4deec5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.775 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[31fa2069-f216-4fb7-ae38-8e0c626f61f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 podman[244558]: 2025-11-22 08:31:17.79567109 +0000 UTC m=+0.069148272 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:31:17 np0005531888 podman[244559]: 2025-11-22 08:31:17.795758933 +0000 UTC m=+0.068364453 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.7965] device (tap3a9ba314-40): carrier: link connected
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.802 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[513b849e-b0eb-49d1-b7ef-0dcdef89539a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.818 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6887b0bf-5e27-4081-b7bf-61b71ccc1b35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9ba314-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c2:73:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701844, 'reachable_time': 30816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244626, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.832 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1b8098-8a43-44aa-942d-09288ddbddf5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec2:73c6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 701844, 'tstamp': 701844}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244627, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.850 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe2f113-383e-43ef-8e54-fa0087022567]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9ba314-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c2:73:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701844, 'reachable_time': 30816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244628, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.883 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[03aba41c-bde1-4f58-b8da-1e9ae70bfb54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.943 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b83759d8-5ab7-431b-be8c-eec11b9e7877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.949 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9ba314-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.949 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.949 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a9ba314-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 NetworkManager[55166]: <info>  [1763800277.9518] manager: (tap3a9ba314-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Nov 22 03:31:17 np0005531888 kernel: tap3a9ba314-40: entered promiscuous mode
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.953 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.956 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a9ba314-40, col_values=(('external_ids', {'iface-id': 'c72d72ff-6f7f-4cf0-b2ac-4983e5b20e5d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:17 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:17Z|00667|binding|INFO|Releasing lport c72d72ff-6f7f-4cf0-b2ac-4983e5b20e5d from this chassis (sb_readonly=0)
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.959 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a9ba314-47b1-4454-bcbf-13054f5b67cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a9ba314-47b1-4454-bcbf-13054f5b67cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.971 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[228659f8-1d9e-4ea0-89e0-94385eacc959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.973 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-3a9ba314-47b1-4454-bcbf-13054f5b67cd
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/3a9ba314-47b1-4454-bcbf-13054f5b67cd.pid.haproxy
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 3a9ba314-47b1-4454-bcbf-13054f5b67cd
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:31:17 np0005531888 nova_compute[186788]: 2025-11-22 08:31:17.973 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:17.974 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'env', 'PROCESS_TAG=haproxy-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a9ba314-47b1-4454-bcbf-13054f5b67cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.103 186792 DEBUG nova.compute.manager [req-cb39789a-08b1-4aec-ac23-27420729b6d2 req-1b1cdb6e-d576-4822-9f1a-cf7489a59cc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.104 186792 DEBUG oslo_concurrency.lockutils [req-cb39789a-08b1-4aec-ac23-27420729b6d2 req-1b1cdb6e-d576-4822-9f1a-cf7489a59cc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.104 186792 DEBUG oslo_concurrency.lockutils [req-cb39789a-08b1-4aec-ac23-27420729b6d2 req-1b1cdb6e-d576-4822-9f1a-cf7489a59cc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.104 186792 DEBUG oslo_concurrency.lockutils [req-cb39789a-08b1-4aec-ac23-27420729b6d2 req-1b1cdb6e-d576-4822-9f1a-cf7489a59cc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.104 186792 DEBUG nova.compute.manager [req-cb39789a-08b1-4aec-ac23-27420729b6d2 req-1b1cdb6e-d576-4822-9f1a-cf7489a59cc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.104 186792 WARNING nova.compute.manager [req-cb39789a-08b1-4aec-ac23-27420729b6d2 req-1b1cdb6e-d576-4822-9f1a-cf7489a59cc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 22 03:31:18 np0005531888 podman[244659]: 2025-11-22 08:31:18.339134334 +0000 UTC m=+0.022550095 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.610 186792 DEBUG nova.virt.libvirt.host [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Removed pending event for 6074046e-cf5c-4db5-9662-721f727de670 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.611 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800278.6098814, 6074046e-cf5c-4db5-9662-721f727de670 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.611 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.613 186792 DEBUG nova.compute.manager [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.615 186792 INFO nova.virt.libvirt.driver [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance rebooted successfully.#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.615 186792 DEBUG nova.compute.manager [None req-632c0f8c-c2cb-496b-944c-c3c0a09f733c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.632 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.636 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.656 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.656 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800278.6126423, 6074046e-cf5c-4db5-9662-721f727de670 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.656 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] VM Started (Lifecycle Event)#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.681 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:31:18 np0005531888 nova_compute[186788]: 2025-11-22 08:31:18.690 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:31:19 np0005531888 podman[244659]: 2025-11-22 08:31:19.057084948 +0000 UTC m=+0.740500689 container create 715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:31:19 np0005531888 systemd[1]: Started libpod-conmon-715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c.scope.
Nov 22 03:31:19 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:31:19 np0005531888 nova_compute[186788]: 2025-11-22 08:31:19.245 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:19 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/733edea4ca00c77b0ab450bd50ed4e7970581546f8e8ad7f2ec2e430cd5abe08/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:31:19 np0005531888 nova_compute[186788]: 2025-11-22 08:31:19.277 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:31:19 np0005531888 nova_compute[186788]: 2025-11-22 08:31:19.292 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:31:19 np0005531888 nova_compute[186788]: 2025-11-22 08:31:19.292 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:31:19 np0005531888 nova_compute[186788]: 2025-11-22 08:31:19.292 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:19 np0005531888 podman[244659]: 2025-11-22 08:31:19.486813845 +0000 UTC m=+1.170229616 container init 715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:31:19 np0005531888 podman[244659]: 2025-11-22 08:31:19.492086544 +0000 UTC m=+1.175502285 container start 715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:31:19 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [NOTICE]   (244686) : New worker (244688) forked
Nov 22 03:31:19 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [NOTICE]   (244686) : Loading success.
Nov 22 03:31:19 np0005531888 nova_compute[186788]: 2025-11-22 08:31:19.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:20 np0005531888 nova_compute[186788]: 2025-11-22 08:31:20.177 186792 DEBUG nova.compute.manager [req-df18e2b4-b291-4546-9c8b-5d416486f5b8 req-206b629b-bc9a-4ae4-b69e-bd63dc75a8a2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:20 np0005531888 nova_compute[186788]: 2025-11-22 08:31:20.177 186792 DEBUG oslo_concurrency.lockutils [req-df18e2b4-b291-4546-9c8b-5d416486f5b8 req-206b629b-bc9a-4ae4-b69e-bd63dc75a8a2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:20 np0005531888 nova_compute[186788]: 2025-11-22 08:31:20.177 186792 DEBUG oslo_concurrency.lockutils [req-df18e2b4-b291-4546-9c8b-5d416486f5b8 req-206b629b-bc9a-4ae4-b69e-bd63dc75a8a2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:20 np0005531888 nova_compute[186788]: 2025-11-22 08:31:20.178 186792 DEBUG oslo_concurrency.lockutils [req-df18e2b4-b291-4546-9c8b-5d416486f5b8 req-206b629b-bc9a-4ae4-b69e-bd63dc75a8a2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:20 np0005531888 nova_compute[186788]: 2025-11-22 08:31:20.178 186792 DEBUG nova.compute.manager [req-df18e2b4-b291-4546-9c8b-5d416486f5b8 req-206b629b-bc9a-4ae4-b69e-bd63dc75a8a2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:31:20 np0005531888 nova_compute[186788]: 2025-11-22 08:31:20.178 186792 WARNING nova.compute.manager [req-df18e2b4-b291-4546-9c8b-5d416486f5b8 req-206b629b-bc9a-4ae4-b69e-bd63dc75a8a2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:31:21 np0005531888 nova_compute[186788]: 2025-11-22 08:31:21.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:22 np0005531888 nova_compute[186788]: 2025-11-22 08:31:22.449 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:24 np0005531888 nova_compute[186788]: 2025-11-22 08:31:24.247 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:26 np0005531888 nova_compute[186788]: 2025-11-22 08:31:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:27 np0005531888 nova_compute[186788]: 2025-11-22 08:31:27.452 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:28 np0005531888 nova_compute[186788]: 2025-11-22 08:31:28.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:28 np0005531888 nova_compute[186788]: 2025-11-22 08:31:28.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:28 np0005531888 nova_compute[186788]: 2025-11-22 08:31:28.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:28 np0005531888 nova_compute[186788]: 2025-11-22 08:31:28.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:28 np0005531888 nova_compute[186788]: 2025-11-22 08:31:28.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:28 np0005531888 nova_compute[186788]: 2025-11-22 08:31:28.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.041 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.110 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.111 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.174 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.249 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.357 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.358 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5519MB free_disk=73.23665237426758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.358 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:29 np0005531888 nova_compute[186788]: 2025-11-22 08:31:29.359 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.145 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 6074046e-cf5c-4db5-9662-721f727de670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.145 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.146 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:31:30 np0005531888 podman[244704]: 2025-11-22 08:31:30.697759731 +0000 UTC m=+0.062391636 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:31:30 np0005531888 podman[244705]: 2025-11-22 08:31:30.716782708 +0000 UTC m=+0.081436224 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.902 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.916 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.934 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:31:30 np0005531888 nova_compute[186788]: 2025-11-22 08:31:30.935 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:32 np0005531888 nova_compute[186788]: 2025-11-22 08:31:32.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:32 np0005531888 nova_compute[186788]: 2025-11-22 08:31:32.936 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:32 np0005531888 nova_compute[186788]: 2025-11-22 08:31:32.936 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:31:33 np0005531888 podman[244760]: 2025-11-22 08:31:33.690357248 +0000 UTC m=+0.056875580 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6)
Nov 22 03:31:33 np0005531888 podman[244761]: 2025-11-22 08:31:33.696986591 +0000 UTC m=+0.058985762 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:31:33 np0005531888 podman[244762]: 2025-11-22 08:31:33.726923797 +0000 UTC m=+0.085665338 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:31:34 np0005531888 nova_compute[186788]: 2025-11-22 08:31:34.252 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:34Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e9:97:d8 10.100.0.3
Nov 22 03:31:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:36.849 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:36.850 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:36.850 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:37 np0005531888 nova_compute[186788]: 2025-11-22 08:31:37.457 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:38 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:38Z|00668|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 22 03:31:39 np0005531888 nova_compute[186788]: 2025-11-22 08:31:39.254 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:39 np0005531888 nova_compute[186788]: 2025-11-22 08:31:39.943 186792 INFO nova.compute.manager [None req-85454f52-387d-47e6-a3c1-68cf7b0ff0ff d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Get console output#033[00m
Nov 22 03:31:39 np0005531888 nova_compute[186788]: 2025-11-22 08:31:39.948 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:31:42 np0005531888 nova_compute[186788]: 2025-11-22 08:31:42.460 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:42 np0005531888 nova_compute[186788]: 2025-11-22 08:31:42.651 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:42.652 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:31:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:42.654 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.458 186792 DEBUG nova.compute.manager [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-changed-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.459 186792 DEBUG nova.compute.manager [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Refreshing instance network info cache due to event network-changed-9ea5c424-dfb2-4fc9-adaa-06a42cf88172. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.459 186792 DEBUG oslo_concurrency.lockutils [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.459 186792 DEBUG oslo_concurrency.lockutils [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.459 186792 DEBUG nova.network.neutron [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Refreshing network info cache for port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.815 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.816 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.816 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.816 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.817 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.825 186792 INFO nova.compute.manager [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Terminating instance#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.835 186792 DEBUG nova.compute.manager [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:31:43 np0005531888 kernel: tap9ea5c424-df (unregistering): left promiscuous mode
Nov 22 03:31:43 np0005531888 NetworkManager[55166]: <info>  [1763800303.8626] device (tap9ea5c424-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:31:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:43Z|00669|binding|INFO|Releasing lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 from this chassis (sb_readonly=0)
Nov 22 03:31:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:43Z|00670|binding|INFO|Setting lport 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 down in Southbound
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:43 np0005531888 ovn_controller[95067]: 2025-11-22T08:31:43Z|00671|binding|INFO|Removing iface tap9ea5c424-df ovn-installed in OVS
Nov 22 03:31:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:43.887 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:97:d8 10.100.0.3'], port_security=['fa:16:3e:e9:97:d8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6074046e-cf5c-4db5-9662-721f727de670', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '99ffb7a4-3c4f-451f-858c-67610bd9b1c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30043869-5124-4227-86b6-fe04ab3139a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=9ea5c424-dfb2-4fc9-adaa-06a42cf88172) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:31:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:43.888 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172 in datapath 3a9ba314-47b1-4454-bcbf-13054f5b67cd unbound from our chassis#033[00m
Nov 22 03:31:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:43.889 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a9ba314-47b1-4454-bcbf-13054f5b67cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:31:43 np0005531888 nova_compute[186788]: 2025-11-22 08:31:43.890 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:43.891 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a948b9be-4f54-42b4-85f4-9c16e34b7924]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:43.892 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd namespace which is not needed anymore#033[00m
Nov 22 03:31:43 np0005531888 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Nov 22 03:31:43 np0005531888 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a3.scope: Consumed 17.150s CPU time.
Nov 22 03:31:43 np0005531888 systemd-machined[153106]: Machine qemu-80-instance-000000a3 terminated.
Nov 22 03:31:44 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [NOTICE]   (244686) : haproxy version is 2.8.14-c23fe91
Nov 22 03:31:44 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [NOTICE]   (244686) : path to executable is /usr/sbin/haproxy
Nov 22 03:31:44 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [WARNING]  (244686) : Exiting Master process...
Nov 22 03:31:44 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [ALERT]    (244686) : Current worker (244688) exited with code 143 (Terminated)
Nov 22 03:31:44 np0005531888 neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd[244682]: [WARNING]  (244686) : All workers exited. Exiting... (0)
Nov 22 03:31:44 np0005531888 systemd[1]: libpod-715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c.scope: Deactivated successfully.
Nov 22 03:31:44 np0005531888 podman[244847]: 2025-11-22 08:31:44.043594532 +0000 UTC m=+0.064493477 container died 715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.065 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.113 186792 INFO nova.virt.libvirt.driver [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Instance destroyed successfully.#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.114 186792 DEBUG nova.objects.instance [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 6074046e-cf5c-4db5-9662-721f727de670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.128 186792 DEBUG nova.virt.libvirt.vif [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096437739',display_name='tempest-TestNetworkAdvancedServerOps-server-2096437739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096437739',id=163,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG13vP6TorI1BfgP51WB45YlXJdiZADs9Vr8WDj5tFB5h4MOu4V2srvEo0mvIjwOIArHDXlyFjzjTA9S2znrw3FTS7dEtIJ8YprpE+/VSrV3SFmnANGESGFkInD+qAEFVA==',key_name='tempest-TestNetworkAdvancedServerOps-415974526',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:30:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-uyrdhl9b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:31:18Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=6074046e-cf5c-4db5-9662-721f727de670,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.128 186792 DEBUG nova.network.os_vif_util [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.129 186792 DEBUG nova.network.os_vif_util [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.129 186792 DEBUG os_vif [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.131 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.131 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ea5c424-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.134 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.136 186792 INFO os_vif [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:97:d8,bridge_name='br-int',has_traffic_filtering=True,id=9ea5c424-dfb2-4fc9-adaa-06a42cf88172,network=Network(3a9ba314-47b1-4454-bcbf-13054f5b67cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ea5c424-df')#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.136 186792 INFO nova.virt.libvirt.driver [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Deleting instance files /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670_del#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.137 186792 INFO nova.virt.libvirt.driver [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Deletion of /var/lib/nova/instances/6074046e-cf5c-4db5-9662-721f727de670_del complete#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.256 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c-userdata-shm.mount: Deactivated successfully.
Nov 22 03:31:44 np0005531888 systemd[1]: var-lib-containers-storage-overlay-733edea4ca00c77b0ab450bd50ed4e7970581546f8e8ad7f2ec2e430cd5abe08-merged.mount: Deactivated successfully.
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.374 186792 INFO nova.compute.manager [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Took 0.54 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.375 186792 DEBUG oslo.service.loopingcall [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.375 186792 DEBUG nova.compute.manager [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.375 186792 DEBUG nova.network.neutron [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:31:44 np0005531888 podman[244847]: 2025-11-22 08:31:44.444204644 +0000 UTC m=+0.465103599 container cleanup 715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:31:44 np0005531888 systemd[1]: libpod-conmon-715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c.scope: Deactivated successfully.
Nov 22 03:31:44 np0005531888 podman[244894]: 2025-11-22 08:31:44.96541117 +0000 UTC m=+0.495543096 container remove 715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 03:31:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:44.971 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8f787a67-748f-41c9-9a99-b294c936a881]: (4, ('Sat Nov 22 08:31:43 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd (715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c)\n715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c\nSat Nov 22 08:31:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd (715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c)\n715d5bd1db23c0fe7826329289c02efad9138961617a96a7909246fb06ef913c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:44.974 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[509994e9-c296-49cd-89cc-3f6e17bbc1d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:44.975 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9ba314-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:44 np0005531888 nova_compute[186788]: 2025-11-22 08:31:44.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:44 np0005531888 kernel: tap3a9ba314-40: left promiscuous mode
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.001 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:45.005 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1df0e29c-1e07-432d-9275-27dbdc0f571a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:45.025 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c8cd6887-ae9a-4b29-9fc9-d089f7f02133]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:45.026 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6487f027-f5b0-4af2-916d-6db63aab3c79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:45.047 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[030f3978-c463-4b8f-83c2-8b3e2a0f5fb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 701837, 'reachable_time': 15384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244908, 'error': None, 'target': 'ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:45.050 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a9ba314-47b1-4454-bcbf-13054f5b67cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:31:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:45.050 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[5694a1e6-a94b-4d74-b987-d3d57bdf7cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:31:45 np0005531888 systemd[1]: run-netns-ovnmeta\x2d3a9ba314\x2d47b1\x2d4454\x2dbcbf\x2d13054f5b67cd.mount: Deactivated successfully.
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.101 186792 DEBUG nova.network.neutron [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updated VIF entry in instance network info cache for port 9ea5c424-dfb2-4fc9-adaa-06a42cf88172. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.102 186792 DEBUG nova.network.neutron [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [{"id": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "address": "fa:16:3e:e9:97:d8", "network": {"id": "3a9ba314-47b1-4454-bcbf-13054f5b67cd", "bridge": "br-int", "label": "tempest-network-smoke--1922468522", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ea5c424-df", "ovs_interfaceid": "9ea5c424-dfb2-4fc9-adaa-06a42cf88172", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.148 186792 DEBUG nova.network.neutron [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.154 186792 DEBUG oslo_concurrency.lockutils [req-47b58b56-0186-49fa-ab22-6040c7b592bc req-13343343-6b09-42ff-8387-7fe4778a1726 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6074046e-cf5c-4db5-9662-721f727de670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.167 186792 INFO nova.compute.manager [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Took 0.79 seconds to deallocate network for instance.#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.224 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.225 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.295 186792 DEBUG nova.compute.provider_tree [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.317 186792 DEBUG nova.scheduler.client.report [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.337 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.374 186792 INFO nova.scheduler.client.report [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 6074046e-cf5c-4db5-9662-721f727de670#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.664 186792 DEBUG nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-unplugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.665 186792 DEBUG oslo_concurrency.lockutils [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.665 186792 DEBUG oslo_concurrency.lockutils [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.665 186792 DEBUG oslo_concurrency.lockutils [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.665 186792 DEBUG nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-unplugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.666 186792 WARNING nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-unplugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.666 186792 DEBUG nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.666 186792 DEBUG oslo_concurrency.lockutils [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6074046e-cf5c-4db5-9662-721f727de670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.666 186792 DEBUG oslo_concurrency.lockutils [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.667 186792 DEBUG oslo_concurrency.lockutils [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.667 186792 DEBUG nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] No waiting events found dispatching network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.667 186792 WARNING nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received unexpected event network-vif-plugged-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:31:45 np0005531888 nova_compute[186788]: 2025-11-22 08:31:45.667 186792 DEBUG nova.compute.manager [req-05faa14c-feb5-4f87-bfa0-ffeb8380858c req-5824d9bf-9cd8-4740-b685-913745df45ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Received event network-vif-deleted-9ea5c424-dfb2-4fc9-adaa-06a42cf88172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:31:46 np0005531888 nova_compute[186788]: 2025-11-22 08:31:46.154 186792 DEBUG oslo_concurrency.lockutils [None req-ce6fd4c7-3d2a-4edd-8e83-38e578b4c56e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "6074046e-cf5c-4db5-9662-721f727de670" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:31:48.661 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:48 np0005531888 podman[244909]: 2025-11-22 08:31:48.69950784 +0000 UTC m=+0.070711540 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:31:48 np0005531888 podman[244910]: 2025-11-22 08:31:48.724178106 +0000 UTC m=+0.093332775 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:31:49 np0005531888 nova_compute[186788]: 2025-11-22 08:31:49.134 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:49 np0005531888 nova_compute[186788]: 2025-11-22 08:31:49.257 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:51 np0005531888 nova_compute[186788]: 2025-11-22 08:31:51.962 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:52 np0005531888 nova_compute[186788]: 2025-11-22 08:31:52.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:54 np0005531888 nova_compute[186788]: 2025-11-22 08:31:54.138 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:54 np0005531888 nova_compute[186788]: 2025-11-22 08:31:54.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:59 np0005531888 nova_compute[186788]: 2025-11-22 08:31:59.110 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800304.1092098, 6074046e-cf5c-4db5-9662-721f727de670 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:31:59 np0005531888 nova_compute[186788]: 2025-11-22 08:31:59.111 186792 INFO nova.compute.manager [-] [instance: 6074046e-cf5c-4db5-9662-721f727de670] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:31:59 np0005531888 nova_compute[186788]: 2025-11-22 08:31:59.137 186792 DEBUG nova.compute.manager [None req-f31e2d65-dfab-4957-80b0-99dbfb5e6f93 - - - - - -] [instance: 6074046e-cf5c-4db5-9662-721f727de670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:31:59 np0005531888 nova_compute[186788]: 2025-11-22 08:31:59.141 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:59 np0005531888 nova_compute[186788]: 2025-11-22 08:31:59.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:01 np0005531888 podman[244953]: 2025-11-22 08:32:01.678368627 +0000 UTC m=+0.050182754 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:32:01 np0005531888 podman[244952]: 2025-11-22 08:32:01.718043653 +0000 UTC m=+0.091146702 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:32:04 np0005531888 nova_compute[186788]: 2025-11-22 08:32:04.144 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:04 np0005531888 nova_compute[186788]: 2025-11-22 08:32:04.262 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:04 np0005531888 podman[244992]: 2025-11-22 08:32:04.735336429 +0000 UTC m=+0.102506782 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, architecture=x86_64, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:32:04 np0005531888 podman[244993]: 2025-11-22 08:32:04.738874845 +0000 UTC m=+0.091911850 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:32:04 np0005531888 podman[244994]: 2025-11-22 08:32:04.755258948 +0000 UTC m=+0.102133912 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:32:09 np0005531888 nova_compute[186788]: 2025-11-22 08:32:09.147 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:09 np0005531888 nova_compute[186788]: 2025-11-22 08:32:09.263 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:14 np0005531888 nova_compute[186788]: 2025-11-22 08:32:14.152 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:14 np0005531888 nova_compute[186788]: 2025-11-22 08:32:14.266 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:15 np0005531888 nova_compute[186788]: 2025-11-22 08:32:15.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:16 np0005531888 nova_compute[186788]: 2025-11-22 08:32:16.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:16 np0005531888 nova_compute[186788]: 2025-11-22 08:32:16.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:16 np0005531888 nova_compute[186788]: 2025-11-22 08:32:16.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:32:16 np0005531888 nova_compute[186788]: 2025-11-22 08:32:16.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:32:16 np0005531888 nova_compute[186788]: 2025-11-22 08:32:16.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:32:19 np0005531888 nova_compute[186788]: 2025-11-22 08:32:19.154 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:19 np0005531888 nova_compute[186788]: 2025-11-22 08:32:19.269 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:19 np0005531888 podman[245060]: 2025-11-22 08:32:19.676803475 +0000 UTC m=+0.050411381 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:32:19 np0005531888 podman[245061]: 2025-11-22 08:32:19.677785469 +0000 UTC m=+0.047129300 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:32:19 np0005531888 nova_compute[186788]: 2025-11-22 08:32:19.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:23 np0005531888 nova_compute[186788]: 2025-11-22 08:32:23.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:24 np0005531888 nova_compute[186788]: 2025-11-22 08:32:24.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:24 np0005531888 nova_compute[186788]: 2025-11-22 08:32:24.270 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:28 np0005531888 nova_compute[186788]: 2025-11-22 08:32:28.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:29 np0005531888 nova_compute[186788]: 2025-11-22 08:32:29.159 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:29 np0005531888 nova_compute[186788]: 2025-11-22 08:32:29.271 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:30 np0005531888 nova_compute[186788]: 2025-11-22 08:32:30.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:30 np0005531888 nova_compute[186788]: 2025-11-22 08:32:30.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:30 np0005531888 nova_compute[186788]: 2025-11-22 08:32:30.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:32:30 np0005531888 nova_compute[186788]: 2025-11-22 08:32:30.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:32:30 np0005531888 nova_compute[186788]: 2025-11-22 08:32:30.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:32:30 np0005531888 nova_compute[186788]: 2025-11-22 08:32:30.978 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.138 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.139 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5707MB free_disk=73.26580429077148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.139 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.139 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.188 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.188 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.208 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.218 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.286 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:32:31 np0005531888 nova_compute[186788]: 2025-11-22 08:32:31.286 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:32:32 np0005531888 ovn_controller[95067]: 2025-11-22T08:32:32Z|00672|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 22 03:32:32 np0005531888 podman[245105]: 2025-11-22 08:32:32.683829125 +0000 UTC m=+0.050055212 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:32:32 np0005531888 podman[245104]: 2025-11-22 08:32:32.705003206 +0000 UTC m=+0.077711392 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 22 03:32:33 np0005531888 nova_compute[186788]: 2025-11-22 08:32:33.286 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:33 np0005531888 nova_compute[186788]: 2025-11-22 08:32:33.287 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:32:34 np0005531888 nova_compute[186788]: 2025-11-22 08:32:34.161 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:34 np0005531888 nova_compute[186788]: 2025-11-22 08:32:34.273 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:35 np0005531888 podman[245149]: 2025-11-22 08:32:35.693087172 +0000 UTC m=+0.062064767 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:32:35 np0005531888 podman[245148]: 2025-11-22 08:32:35.704088073 +0000 UTC m=+0.078033820 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 22 03:32:35 np0005531888 podman[245150]: 2025-11-22 08:32:35.733356013 +0000 UTC m=+0.103270451 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 03:32:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:32:36.851 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:32:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:32:36.851 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:32:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:32:36.851 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.851 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.852 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:32:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:39 np0005531888 nova_compute[186788]: 2025-11-22 08:32:39.163 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:39 np0005531888 nova_compute[186788]: 2025-11-22 08:32:39.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:44 np0005531888 nova_compute[186788]: 2025-11-22 08:32:44.167 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:44 np0005531888 nova_compute[186788]: 2025-11-22 08:32:44.277 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:32:44.909 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:32:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:32:44.910 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:32:44 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:32:44.911 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:32:44 np0005531888 nova_compute[186788]: 2025-11-22 08:32:44.911 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:46 np0005531888 nova_compute[186788]: 2025-11-22 08:32:46.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:49 np0005531888 nova_compute[186788]: 2025-11-22 08:32:49.170 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:49 np0005531888 nova_compute[186788]: 2025-11-22 08:32:49.280 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:50 np0005531888 podman[245214]: 2025-11-22 08:32:50.68578302 +0000 UTC m=+0.054552803 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:32:50 np0005531888 podman[245215]: 2025-11-22 08:32:50.695402916 +0000 UTC m=+0.057551266 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:32:54 np0005531888 nova_compute[186788]: 2025-11-22 08:32:54.172 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:54 np0005531888 nova_compute[186788]: 2025-11-22 08:32:54.282 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:59 np0005531888 nova_compute[186788]: 2025-11-22 08:32:59.176 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:59 np0005531888 nova_compute[186788]: 2025-11-22 08:32:59.284 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:03 np0005531888 podman[245258]: 2025-11-22 08:33:03.683367717 +0000 UTC m=+0.054997223 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:33:03 np0005531888 podman[245257]: 2025-11-22 08:33:03.695624798 +0000 UTC m=+0.070420712 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:33:04 np0005531888 nova_compute[186788]: 2025-11-22 08:33:04.180 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:04 np0005531888 nova_compute[186788]: 2025-11-22 08:33:04.286 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:06 np0005531888 podman[245302]: 2025-11-22 08:33:06.68999482 +0000 UTC m=+0.061780601 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3)
Nov 22 03:33:06 np0005531888 podman[245301]: 2025-11-22 08:33:06.68998056 +0000 UTC m=+0.063161965 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Nov 22 03:33:06 np0005531888 podman[245303]: 2025-11-22 08:33:06.747988526 +0000 UTC m=+0.116927266 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:33:09 np0005531888 nova_compute[186788]: 2025-11-22 08:33:09.183 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:09 np0005531888 nova_compute[186788]: 2025-11-22 08:33:09.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:14 np0005531888 nova_compute[186788]: 2025-11-22 08:33:14.186 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:14 np0005531888 nova_compute[186788]: 2025-11-22 08:33:14.292 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:15 np0005531888 nova_compute[186788]: 2025-11-22 08:33:15.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:16 np0005531888 nova_compute[186788]: 2025-11-22 08:33:16.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:17 np0005531888 nova_compute[186788]: 2025-11-22 08:33:17.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:17 np0005531888 nova_compute[186788]: 2025-11-22 08:33:17.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:33:17 np0005531888 nova_compute[186788]: 2025-11-22 08:33:17.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:33:17 np0005531888 nova_compute[186788]: 2025-11-22 08:33:17.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:33:19 np0005531888 nova_compute[186788]: 2025-11-22 08:33:19.188 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:19 np0005531888 nova_compute[186788]: 2025-11-22 08:33:19.294 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:21 np0005531888 podman[245366]: 2025-11-22 08:33:21.694041387 +0000 UTC m=+0.063537054 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:33:21 np0005531888 podman[245365]: 2025-11-22 08:33:21.695449641 +0000 UTC m=+0.066732452 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:33:21 np0005531888 nova_compute[186788]: 2025-11-22 08:33:21.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:23 np0005531888 nova_compute[186788]: 2025-11-22 08:33:23.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:24 np0005531888 nova_compute[186788]: 2025-11-22 08:33:24.190 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:24 np0005531888 nova_compute[186788]: 2025-11-22 08:33:24.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.292 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.293 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.316 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.426 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.427 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.433 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.433 186792 INFO nova.compute.claims [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.580 186792 DEBUG nova.compute.provider_tree [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.597 186792 DEBUG nova.scheduler.client.report [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.670 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.671 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.828 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.829 186792 DEBUG nova.network.neutron [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.954 186792 INFO nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:33:25 np0005531888 nova_compute[186788]: 2025-11-22 08:33:25.976 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.222 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.224 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.224 186792 INFO nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Creating image(s)#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.225 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.225 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.226 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.238 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.300 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.301 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.302 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.328 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.393 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.395 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.950 186792 DEBUG nova.policy [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:33:26 np0005531888 nova_compute[186788]: 2025-11-22 08:33:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.038 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk 1073741824" returned: 0 in 0.643s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.039 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.737s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.040 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.093 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.094 186792 DEBUG nova.virt.disk.api [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.095 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.150 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.150 186792 DEBUG nova.virt.disk.api [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.151 186792 DEBUG nova.objects.instance [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 28c4266d-8891-42aa-b05f-9e25e32a2105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.162 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.162 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Ensure instance console log exists: /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.162 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.163 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:27 np0005531888 nova_compute[186788]: 2025-11-22 08:33:27.163 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:29 np0005531888 nova_compute[186788]: 2025-11-22 08:33:29.192 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:29 np0005531888 nova_compute[186788]: 2025-11-22 08:33:29.297 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:29 np0005531888 nova_compute[186788]: 2025-11-22 08:33:29.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:30 np0005531888 nova_compute[186788]: 2025-11-22 08:33:30.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:30 np0005531888 nova_compute[186788]: 2025-11-22 08:33:30.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:30 np0005531888 nova_compute[186788]: 2025-11-22 08:33:30.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:30 np0005531888 nova_compute[186788]: 2025-11-22 08:33:30.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:30 np0005531888 nova_compute[186788]: 2025-11-22 08:33:30.974 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.067 186792 DEBUG nova.network.neutron [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Successfully created port: bfb6d679-0393-475e-aa21-80b081b6dd4a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.118 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.119 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.27116012573242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.119 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.119 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.273 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 28c4266d-8891-42aa-b05f-9e25e32a2105 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.320 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.332 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.352 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:33:31 np0005531888 nova_compute[186788]: 2025-11-22 08:33:31.353 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.085 186792 DEBUG nova.network.neutron [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Successfully updated port: bfb6d679-0393-475e-aa21-80b081b6dd4a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.108 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.108 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.108 186792 DEBUG nova.network.neutron [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.202 186792 DEBUG nova.compute.manager [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-changed-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.203 186792 DEBUG nova.compute.manager [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Refreshing instance network info cache due to event network-changed-bfb6d679-0393-475e-aa21-80b081b6dd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.203 186792 DEBUG oslo_concurrency.lockutils [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.259 186792 DEBUG nova.network.neutron [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:33:32 np0005531888 nova_compute[186788]: 2025-11-22 08:33:32.353 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.158 186792 DEBUG nova.network.neutron [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updating instance_info_cache with network_info: [{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.340 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.340 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Instance network_info: |[{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.341 186792 DEBUG oslo_concurrency.lockutils [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.341 186792 DEBUG nova.network.neutron [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Refreshing network info cache for port bfb6d679-0393-475e-aa21-80b081b6dd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.344 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Start _get_guest_xml network_info=[{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.348 186792 WARNING nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.359 186792 DEBUG nova.virt.libvirt.host [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.360 186792 DEBUG nova.virt.libvirt.host [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.366 186792 DEBUG nova.virt.libvirt.host [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.367 186792 DEBUG nova.virt.libvirt.host [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.368 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.369 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.369 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.369 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.370 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.370 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.370 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.370 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.371 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.371 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.371 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.371 186792 DEBUG nova.virt.hardware [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.375 186792 DEBUG nova.virt.libvirt.vif [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-839460819',display_name='tempest-TestNetworkBasicOps-server-839460819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-839460819',id=165,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOfiPITIfaG7HOmmrU+dUnF4R1GQwF/SErx/rVrjQAma6JYHV+pzvBu8LoitrnT3oUHPG7p23mRG8NO9kH96MBDZSSjQFLRMUUQzr+b8Rb5gn4T1YV7NdB2v+tYxsKF1Q==',key_name='tempest-TestNetworkBasicOps-609619719',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-0vb8uett',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:33:26Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=28c4266d-8891-42aa-b05f-9e25e32a2105,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.376 186792 DEBUG nova.network.os_vif_util [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.376 186792 DEBUG nova.network.os_vif_util [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.378 186792 DEBUG nova.objects.instance [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 28c4266d-8891-42aa-b05f-9e25e32a2105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.398 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <uuid>28c4266d-8891-42aa-b05f-9e25e32a2105</uuid>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <name>instance-000000a5</name>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkBasicOps-server-839460819</nova:name>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:33:33</nova:creationTime>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        <nova:port uuid="bfb6d679-0393-475e-aa21-80b081b6dd4a">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <entry name="serial">28c4266d-8891-42aa-b05f-9e25e32a2105</entry>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <entry name="uuid">28c4266d-8891-42aa-b05f-9e25e32a2105</entry>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.config"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:72:bf:79"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <target dev="tapbfb6d679-03"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/console.log" append="off"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:33:33 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:33:33 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:33:33 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:33:33 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.399 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Preparing to wait for external event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.399 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.400 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.400 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.401 186792 DEBUG nova.virt.libvirt.vif [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-839460819',display_name='tempest-TestNetworkBasicOps-server-839460819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-839460819',id=165,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOfiPITIfaG7HOmmrU+dUnF4R1GQwF/SErx/rVrjQAma6JYHV+pzvBu8LoitrnT3oUHPG7p23mRG8NO9kH96MBDZSSjQFLRMUUQzr+b8Rb5gn4T1YV7NdB2v+tYxsKF1Q==',key_name='tempest-TestNetworkBasicOps-609619719',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-0vb8uett',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:33:26Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=28c4266d-8891-42aa-b05f-9e25e32a2105,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.401 186792 DEBUG nova.network.os_vif_util [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.402 186792 DEBUG nova.network.os_vif_util [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.402 186792 DEBUG os_vif [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.403 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.403 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.404 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.406 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.406 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfb6d679-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.407 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfb6d679-03, col_values=(('external_ids', {'iface-id': 'bfb6d679-0393-475e-aa21-80b081b6dd4a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:bf:79', 'vm-uuid': '28c4266d-8891-42aa-b05f-9e25e32a2105'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:33 np0005531888 NetworkManager[55166]: <info>  [1763800413.4097] manager: (tapbfb6d679-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.411 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.420 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.421 186792 INFO os_vif [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03')#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.598 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.598 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.598 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:72:bf:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:33:33 np0005531888 nova_compute[186788]: 2025-11-22 08:33:33.599 186792 INFO nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Using config drive#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.165 186792 INFO nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Creating config drive at /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.config#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.172 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp17oiwr0u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.296 186792 DEBUG oslo_concurrency.processutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp17oiwr0u" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 kernel: tapbfb6d679-03: entered promiscuous mode
Nov 22 03:33:34 np0005531888 NetworkManager[55166]: <info>  [1763800414.3906] manager: (tapbfb6d679-03): new Tun device (/org/freedesktop/NetworkManager/Devices/319)
Nov 22 03:33:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:34Z|00673|binding|INFO|Claiming lport bfb6d679-0393-475e-aa21-80b081b6dd4a for this chassis.
Nov 22 03:33:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:34Z|00674|binding|INFO|bfb6d679-0393-475e-aa21-80b081b6dd4a: Claiming fa:16:3e:72:bf:79 10.100.0.4
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.394 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.401 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 systemd-udevd[245462]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:33:34 np0005531888 systemd-machined[153106]: New machine qemu-81-instance-000000a5.
Nov 22 03:33:34 np0005531888 NetworkManager[55166]: <info>  [1763800414.4390] device (tapbfb6d679-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:33:34 np0005531888 NetworkManager[55166]: <info>  [1763800414.4397] device (tapbfb6d679-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:33:34 np0005531888 systemd[1]: Started Virtual Machine qemu-81-instance-000000a5.
Nov 22 03:33:34 np0005531888 podman[245431]: 2025-11-22 08:33:34.459890078 +0000 UTC m=+0.095728115 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 22 03:33:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:34Z|00675|binding|INFO|Setting lport bfb6d679-0393-475e-aa21-80b081b6dd4a ovn-installed in OVS
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.461 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 podman[245432]: 2025-11-22 08:33:34.470317035 +0000 UTC m=+0.087498493 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.499 186792 DEBUG nova.network.neutron [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updated VIF entry in instance network info cache for port bfb6d679-0393-475e-aa21-80b081b6dd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.500 186792 DEBUG nova.network.neutron [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updating instance_info_cache with network_info: [{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.518 186792 DEBUG oslo_concurrency.lockutils [req-85d05bc9-d804-4325-bdf3-e8bd65b28c71 req-5297ca79-efc9-44c0-ba7e-342a8ca916ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.533 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:bf:79 10.100.0.4'], port_security=['fa:16:3e:72:bf:79 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4283ce39-f10b-4253-aafd-7adef97db372', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5e07858-1687-49e6-a10b-a34ed4fdb0a1, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=bfb6d679-0393-475e-aa21-80b081b6dd4a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:33:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:34Z|00676|binding|INFO|Setting lport bfb6d679-0393-475e-aa21-80b081b6dd4a up in Southbound
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.534 104023 INFO neutron.agent.ovn.metadata.agent [-] Port bfb6d679-0393-475e-aa21-80b081b6dd4a in datapath e034377b-e7dc-4d6e-b7c8-53f948b62761 bound to our chassis#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.535 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e034377b-e7dc-4d6e-b7c8-53f948b62761#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.546 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8f93df-5474-4342-9d96-9b677048c1ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.546 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape034377b-e1 in ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.549 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape034377b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.549 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c3a563-c360-47d1-991c-bfad50e4514a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.550 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a1cc7d38-d8d8-46e5-9922-2dbae53552d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.560 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[d7690a67-b4df-487a-a60b-4cc4adac843e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.572 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[86271917-b25e-4f56-a79a-341db3021c1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.603 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[44ed47b7-1cab-4fc9-8974-11f531b09835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.611 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9a43abd0-292f-4c2f-af02-f6d7ec729b8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 NetworkManager[55166]: <info>  [1763800414.6121] manager: (tape034377b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/320)
Nov 22 03:33:34 np0005531888 systemd-udevd[245467]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.645 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef65bd1-ccff-478f-85e5-e5e51e132257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.648 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea88b67-6c2e-4e1f-86f8-32e0c30a7d0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 NetworkManager[55166]: <info>  [1763800414.6689] device (tape034377b-e0): carrier: link connected
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.676 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a75bc286-f38e-4a90-8235-d381466b9fbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.697 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b2097e59-e294-41d8-ac54-c8fa2f22725f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape034377b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:bf:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715531, 'reachable_time': 15657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245515, 'error': None, 'target': 'ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.711 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2650cb8d-3270-42c4-b13b-84ad333a7027]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:bfec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 715531, 'tstamp': 715531}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245516, 'error': None, 'target': 'ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.728 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[39d5605a-714a-4a7f-8451-5e1726f04431]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape034377b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:bf:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715531, 'reachable_time': 15657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245517, 'error': None, 'target': 'ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.762 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1eab4620-5ef7-4155-a135-d23082fd376b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.827 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6f846be9-d52e-4b3c-bcde-68fa228cebaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.829 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape034377b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.830 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.831 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape034377b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.833 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 NetworkManager[55166]: <info>  [1763800414.8343] manager: (tape034377b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Nov 22 03:33:34 np0005531888 kernel: tape034377b-e0: entered promiscuous mode
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.836 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.837 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape034377b-e0, col_values=(('external_ids', {'iface-id': 'c2e4c3a5-2adc-45b3-861d-eb69d02fc3dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.839 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:34Z|00677|binding|INFO|Releasing lport c2e4c3a5-2adc-45b3-861d-eb69d02fc3dd from this chassis (sb_readonly=0)
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.841 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e034377b-e7dc-4d6e-b7c8-53f948b62761.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e034377b-e7dc-4d6e-b7c8-53f948b62761.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.841 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3c56eccb-7931-4ad2-a680-3f2ab8eebbae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.842 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-e034377b-e7dc-4d6e-b7c8-53f948b62761
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/e034377b-e7dc-4d6e-b7c8-53f948b62761.pid.haproxy
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID e034377b-e7dc-4d6e-b7c8-53f948b62761
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:33:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:34.843 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'env', 'PROCESS_TAG=haproxy-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e034377b-e7dc-4d6e-b7c8-53f948b62761.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.851 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:34 np0005531888 nova_compute[186788]: 2025-11-22 08:33:34.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.087 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800415.0871162, 28c4266d-8891-42aa-b05f-9e25e32a2105 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.087 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] VM Started (Lifecycle Event)#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.105 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.110 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800415.0872784, 28c4266d-8891-42aa-b05f-9e25e32a2105 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.111 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.129 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.133 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.153 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.176 186792 DEBUG nova.compute.manager [req-2239de5c-40c2-4e34-9b9b-c9e17e25f87b req-54ec0bf4-c4a6-4afb-bc6d-c93d85079aa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.177 186792 DEBUG oslo_concurrency.lockutils [req-2239de5c-40c2-4e34-9b9b-c9e17e25f87b req-54ec0bf4-c4a6-4afb-bc6d-c93d85079aa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.177 186792 DEBUG oslo_concurrency.lockutils [req-2239de5c-40c2-4e34-9b9b-c9e17e25f87b req-54ec0bf4-c4a6-4afb-bc6d-c93d85079aa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.177 186792 DEBUG oslo_concurrency.lockutils [req-2239de5c-40c2-4e34-9b9b-c9e17e25f87b req-54ec0bf4-c4a6-4afb-bc6d-c93d85079aa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.178 186792 DEBUG nova.compute.manager [req-2239de5c-40c2-4e34-9b9b-c9e17e25f87b req-54ec0bf4-c4a6-4afb-bc6d-c93d85079aa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Processing event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.178 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.182 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800415.181899, 28c4266d-8891-42aa-b05f-9e25e32a2105 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.182 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.184 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.188 186792 INFO nova.virt.libvirt.driver [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Instance spawned successfully.#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.189 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.200 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.205 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.208 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.208 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.209 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.209 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.209 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.210 186792 DEBUG nova.virt.libvirt.driver [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.232 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:33:35 np0005531888 podman[245556]: 2025-11-22 08:33:35.253765839 +0000 UTC m=+0.028782738 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.374 186792 INFO nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Took 9.15 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.375 186792 DEBUG nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.628 186792 INFO nova.compute.manager [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Took 10.25 seconds to build instance.#033[00m
Nov 22 03:33:35 np0005531888 nova_compute[186788]: 2025-11-22 08:33:35.685 186792 DEBUG oslo_concurrency.lockutils [None req-60a62759-df56-45f8-8ae9-283f462d692b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:36 np0005531888 podman[245556]: 2025-11-22 08:33:36.384410002 +0000 UTC m=+1.159426881 container create 17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:33:36 np0005531888 systemd[1]: Started libpod-conmon-17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9.scope.
Nov 22 03:33:36 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:33:36 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f1b481e7e02a7f11b2697903e95d424d89a633f8953440536b30ec15739c669/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:33:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:36.853 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:36.855 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:36.855 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:36 np0005531888 podman[245556]: 2025-11-22 08:33:36.879949317 +0000 UTC m=+1.654966196 container init 17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:33:36 np0005531888 podman[245556]: 2025-11-22 08:33:36.886125678 +0000 UTC m=+1.661142597 container start 17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:33:36 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [NOTICE]   (245576) : New worker (245578) forked
Nov 22 03:33:36 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [NOTICE]   (245576) : Loading success.
Nov 22 03:33:37 np0005531888 nova_compute[186788]: 2025-11-22 08:33:37.253 186792 DEBUG nova.compute.manager [req-3992368a-ec03-4878-b079-339c46cc7e55 req-0864cdc2-27f7-497b-aa3a-b257ff8bf5a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:33:37 np0005531888 nova_compute[186788]: 2025-11-22 08:33:37.253 186792 DEBUG oslo_concurrency.lockutils [req-3992368a-ec03-4878-b079-339c46cc7e55 req-0864cdc2-27f7-497b-aa3a-b257ff8bf5a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:37 np0005531888 nova_compute[186788]: 2025-11-22 08:33:37.254 186792 DEBUG oslo_concurrency.lockutils [req-3992368a-ec03-4878-b079-339c46cc7e55 req-0864cdc2-27f7-497b-aa3a-b257ff8bf5a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:37 np0005531888 nova_compute[186788]: 2025-11-22 08:33:37.254 186792 DEBUG oslo_concurrency.lockutils [req-3992368a-ec03-4878-b079-339c46cc7e55 req-0864cdc2-27f7-497b-aa3a-b257ff8bf5a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:37 np0005531888 nova_compute[186788]: 2025-11-22 08:33:37.255 186792 DEBUG nova.compute.manager [req-3992368a-ec03-4878-b079-339c46cc7e55 req-0864cdc2-27f7-497b-aa3a-b257ff8bf5a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] No waiting events found dispatching network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:33:37 np0005531888 nova_compute[186788]: 2025-11-22 08:33:37.255 186792 WARNING nova.compute.manager [req-3992368a-ec03-4878-b079-339c46cc7e55 req-0864cdc2-27f7-497b-aa3a-b257ff8bf5a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received unexpected event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a for instance with vm_state active and task_state None.#033[00m
Nov 22 03:33:37 np0005531888 podman[245588]: 2025-11-22 08:33:37.846838702 +0000 UTC m=+0.219456277 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:33:37 np0005531888 podman[245589]: 2025-11-22 08:33:37.858018837 +0000 UTC m=+0.221838236 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:33:37 np0005531888 podman[245590]: 2025-11-22 08:33:37.887393729 +0000 UTC m=+0.249433314 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Nov 22 03:33:38 np0005531888 nova_compute[186788]: 2025-11-22 08:33:38.409 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:39 np0005531888 nova_compute[186788]: 2025-11-22 08:33:39.301 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:41Z|00678|binding|INFO|Releasing lport c2e4c3a5-2adc-45b3-861d-eb69d02fc3dd from this chassis (sb_readonly=0)
Nov 22 03:33:41 np0005531888 NetworkManager[55166]: <info>  [1763800421.1974] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Nov 22 03:33:41 np0005531888 nova_compute[186788]: 2025-11-22 08:33:41.196 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:41 np0005531888 NetworkManager[55166]: <info>  [1763800421.1984] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Nov 22 03:33:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:41Z|00679|binding|INFO|Releasing lport c2e4c3a5-2adc-45b3-861d-eb69d02fc3dd from this chassis (sb_readonly=0)
Nov 22 03:33:41 np0005531888 nova_compute[186788]: 2025-11-22 08:33:41.240 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:41 np0005531888 nova_compute[186788]: 2025-11-22 08:33:41.245 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:42 np0005531888 nova_compute[186788]: 2025-11-22 08:33:42.407 186792 DEBUG nova.compute.manager [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-changed-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:33:42 np0005531888 nova_compute[186788]: 2025-11-22 08:33:42.407 186792 DEBUG nova.compute.manager [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Refreshing instance network info cache due to event network-changed-bfb6d679-0393-475e-aa21-80b081b6dd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:33:42 np0005531888 nova_compute[186788]: 2025-11-22 08:33:42.408 186792 DEBUG oslo_concurrency.lockutils [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:33:42 np0005531888 nova_compute[186788]: 2025-11-22 08:33:42.408 186792 DEBUG oslo_concurrency.lockutils [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:33:42 np0005531888 nova_compute[186788]: 2025-11-22 08:33:42.408 186792 DEBUG nova.network.neutron [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Refreshing network info cache for port bfb6d679-0393-475e-aa21-80b081b6dd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:33:43 np0005531888 nova_compute[186788]: 2025-11-22 08:33:43.412 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:44 np0005531888 nova_compute[186788]: 2025-11-22 08:33:44.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:44 np0005531888 nova_compute[186788]: 2025-11-22 08:33:44.973 186792 DEBUG nova.network.neutron [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updated VIF entry in instance network info cache for port bfb6d679-0393-475e-aa21-80b081b6dd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:33:44 np0005531888 nova_compute[186788]: 2025-11-22 08:33:44.974 186792 DEBUG nova.network.neutron [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updating instance_info_cache with network_info: [{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:33:44 np0005531888 nova_compute[186788]: 2025-11-22 08:33:44.988 186792 DEBUG oslo_concurrency.lockutils [req-6eb9f070-6918-47f5-b1e8-f0836b612872 req-8bc254ca-5513-4337-8222-9f9e4606ac92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:33:48 np0005531888 nova_compute[186788]: 2025-11-22 08:33:48.414 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:49 np0005531888 nova_compute[186788]: 2025-11-22 08:33:49.305 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:50 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:50Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:72:bf:79 10.100.0.4
Nov 22 03:33:50 np0005531888 ovn_controller[95067]: 2025-11-22T08:33:50Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:72:bf:79 10.100.0.4
Nov 22 03:33:52 np0005531888 podman[245669]: 2025-11-22 08:33:52.674346428 +0000 UTC m=+0.052414939 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:33:52 np0005531888 podman[245670]: 2025-11-22 08:33:52.710504008 +0000 UTC m=+0.082989712 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:33:53 np0005531888 nova_compute[186788]: 2025-11-22 08:33:53.416 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:54 np0005531888 nova_compute[186788]: 2025-11-22 08:33:54.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:57 np0005531888 nova_compute[186788]: 2025-11-22 08:33:57.083 186792 INFO nova.compute.manager [None req-aa0cb0ff-63cc-4800-ab26-827bdaa8b6b7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Get console output#033[00m
Nov 22 03:33:57 np0005531888 nova_compute[186788]: 2025-11-22 08:33:57.089 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:33:58 np0005531888 nova_compute[186788]: 2025-11-22 08:33:58.418 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:59 np0005531888 nova_compute[186788]: 2025-11-22 08:33:59.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:59.987 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:33:59 np0005531888 nova_compute[186788]: 2025-11-22 08:33:59.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:33:59.988 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:34:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:00.991 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:34:03 np0005531888 nova_compute[186788]: 2025-11-22 08:34:03.420 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:04 np0005531888 nova_compute[186788]: 2025-11-22 08:34:04.310 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:04 np0005531888 podman[245715]: 2025-11-22 08:34:04.67768474 +0000 UTC m=+0.047660103 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:34:04 np0005531888 podman[245714]: 2025-11-22 08:34:04.683048432 +0000 UTC m=+0.054841089 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 03:34:08 np0005531888 nova_compute[186788]: 2025-11-22 08:34:08.422 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:08 np0005531888 podman[245760]: 2025-11-22 08:34:08.693200092 +0000 UTC m=+0.063023981 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:34:08 np0005531888 podman[245759]: 2025-11-22 08:34:08.701912405 +0000 UTC m=+0.066547676 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:34:08 np0005531888 podman[245761]: 2025-11-22 08:34:08.72200029 +0000 UTC m=+0.089019170 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:34:09 np0005531888 nova_compute[186788]: 2025-11-22 08:34:09.312 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:13 np0005531888 nova_compute[186788]: 2025-11-22 08:34:13.424 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:14 np0005531888 nova_compute[186788]: 2025-11-22 08:34:14.313 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:15 np0005531888 nova_compute[186788]: 2025-11-22 08:34:15.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:16 np0005531888 nova_compute[186788]: 2025-11-22 08:34:16.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:17 np0005531888 nova_compute[186788]: 2025-11-22 08:34:17.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:17 np0005531888 nova_compute[186788]: 2025-11-22 08:34:17.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:34:17 np0005531888 nova_compute[186788]: 2025-11-22 08:34:17.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:34:18 np0005531888 nova_compute[186788]: 2025-11-22 08:34:18.090 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:34:18 np0005531888 nova_compute[186788]: 2025-11-22 08:34:18.091 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:34:18 np0005531888 nova_compute[186788]: 2025-11-22 08:34:18.091 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:34:18 np0005531888 nova_compute[186788]: 2025-11-22 08:34:18.091 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 28c4266d-8891-42aa-b05f-9e25e32a2105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:34:18 np0005531888 nova_compute[186788]: 2025-11-22 08:34:18.425 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:19 np0005531888 nova_compute[186788]: 2025-11-22 08:34:19.315 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:19 np0005531888 nova_compute[186788]: 2025-11-22 08:34:19.561 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updating instance_info_cache with network_info: [{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:34:19 np0005531888 nova_compute[186788]: 2025-11-22 08:34:19.572 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:34:19 np0005531888 nova_compute[186788]: 2025-11-22 08:34:19.572 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:34:21 np0005531888 nova_compute[186788]: 2025-11-22 08:34:21.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:22 np0005531888 nova_compute[186788]: 2025-11-22 08:34:22.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:23 np0005531888 nova_compute[186788]: 2025-11-22 08:34:23.428 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:23 np0005531888 podman[245826]: 2025-11-22 08:34:23.681707578 +0000 UTC m=+0.049847406 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:34:23 np0005531888 podman[245827]: 2025-11-22 08:34:23.686901026 +0000 UTC m=+0.050379000 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:34:23 np0005531888 nova_compute[186788]: 2025-11-22 08:34:23.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:23 np0005531888 nova_compute[186788]: 2025-11-22 08:34:23.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:23 np0005531888 nova_compute[186788]: 2025-11-22 08:34:23.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:34:24 np0005531888 nova_compute[186788]: 2025-11-22 08:34:24.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:28 np0005531888 nova_compute[186788]: 2025-11-22 08:34:28.429 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:29 np0005531888 nova_compute[186788]: 2025-11-22 08:34:29.319 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:30 np0005531888 nova_compute[186788]: 2025-11-22 08:34:30.965 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:30 np0005531888 nova_compute[186788]: 2025-11-22 08:34:30.993 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:30 np0005531888 nova_compute[186788]: 2025-11-22 08:34:30.993 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:30 np0005531888 nova_compute[186788]: 2025-11-22 08:34:30.994 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:30 np0005531888 nova_compute[186788]: 2025-11-22 08:34:30.994 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.074 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.133 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.134 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.189 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.363 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.364 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5557MB free_disk=73.23796463012695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.365 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.365 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.448 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 28c4266d-8891-42aa-b05f-9e25e32a2105 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.449 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.449 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.483 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.508 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.569 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:34:31 np0005531888 nova_compute[186788]: 2025-11-22 08:34:31.569 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:32 np0005531888 nova_compute[186788]: 2025-11-22 08:34:32.559 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:32 np0005531888 nova_compute[186788]: 2025-11-22 08:34:32.559 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:33 np0005531888 nova_compute[186788]: 2025-11-22 08:34:33.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:34 np0005531888 nova_compute[186788]: 2025-11-22 08:34:34.321 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:35 np0005531888 podman[245873]: 2025-11-22 08:34:35.702710032 +0000 UTC m=+0.057436193 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:34:35 np0005531888 podman[245872]: 2025-11-22 08:34:35.727900492 +0000 UTC m=+0.091477161 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:34:35 np0005531888 nova_compute[186788]: 2025-11-22 08:34:35.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:35 np0005531888 nova_compute[186788]: 2025-11-22 08:34:35.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:34:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:36.854 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:36.854 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:36.855 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.857 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'name': 'tempest-TestNetworkBasicOps-server-839460819', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a5', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '12f63a6d87a947758ab928c0d625ff06', 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'hostId': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.858 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.861 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 28c4266d-8891-42aa-b05f-9e25e32a2105 / tapbfb6d679-03 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.862 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58ca41d1-3d79-4038-9a7e-2ec4fd63742a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.858328', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '14490bbe-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '22e52cd0646d8b3dac8ed8b506af5780099c113c7856697d811db6ced521d6d0'}]}, 'timestamp': '2025-11-22 08:34:36.862578', '_unique_id': '8b41bd67431842f0be7409b5f6a78239'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.863 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.864 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.876 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.877 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af22f10a-3452-44d8-b0ef-de595bb9195f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.865052', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '144b52fc-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.564560757, 'message_signature': '1d64a0802d38b23687d14b70c358142ac4f657fdf64c665e51eb8369fdb71d41'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.865052', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '144b638c-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.564560757, 'message_signature': 'ca39882c68c3540c15c9663364e586f0c6c82bad6ba9ad07d31ec43239e026e1'}]}, 'timestamp': '2025-11-22 08:34:36.877826', '_unique_id': '8eda7b63ea7d4e79ac565ddd6b1f93f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.878 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.879 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.incoming.packets volume: 272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2df3e070-2bc6-41fe-9452-3ed65b7f8e8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 272, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.879940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '144bc322-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '619021a1003c179918e92bf1dbb1561b3e6b3921f529692c9436e6f27dd7dd6a'}]}, 'timestamp': '2025-11-22 08:34:36.880276', '_unique_id': '40952549050049e2b7884a782490f248'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.881 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.900 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/memory.usage volume: 42.8984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7311ca6d-8a65-409f-9c8d-e9c44fbbb557', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8984375, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'timestamp': '2025-11-22T08:34:36.881938', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '144ee930-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.599755442, 'message_signature': '3930452508659f8c365a170667e99453fa529b2ffa60950dafc0ab35c6e9d206'}]}, 'timestamp': '2025-11-22 08:34:36.900957', '_unique_id': '76d220ccd41a47c585f787157eacddd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.901 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.903 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.903 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>]
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.903 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.903 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66b75342-b3b0-4d6d-bb97-ec053286b081', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.903703', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '144f6158-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.564560757, 'message_signature': '11bfe9a342d428a6d0ec9e3644418b05771033612a1008274d55a823fcdfc848'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.903703', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '144f6bee-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.564560757, 'message_signature': 'e4a1d70761117f219c6e2d73df51c19e8ecdfdd615c729c3aa46ddff37b4589a'}]}, 'timestamp': '2025-11-22 08:34:36.904277', '_unique_id': 'b1da187870a0442188126048ffad6745'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.904 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.906 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.incoming.bytes volume: 51999 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c25ec9e4-8e96-4a5f-b48c-adb491ddf721', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 51999, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.906370', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '144fcb48-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '9386508a75727fb27c3b786bcc150b379ec381f5bc2506f75eb69c000fc1bfc5'}]}, 'timestamp': '2025-11-22 08:34:36.906705', '_unique_id': '8eaa55ca2e2748a09fc97dbdcf2bc518'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.907 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.908 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.908 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80229de0-1598-4054-a824-6cdbdd5f0bde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.908464', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '14501c60-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.564560757, 'message_signature': 'd7270189645f6420edb26af01f9e66f31bfc15d89110edce605f3599a29c89ed'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.908464', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1450261a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.564560757, 'message_signature': '74161a1de8c3e2d29bd8d74ddc01463b03950f8603eda5d8e2376f0838e1c2b9'}]}, 'timestamp': '2025-11-22 08:34:36.909012', '_unique_id': '3c1cb48eb580433b8e91a3a3c45bb81e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.909 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.910 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '346cd25d-936e-4f1e-9c52-61c421a1808c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.910757', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '145074da-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '0be4439714a02b9adb4b9293f90ab85c4058b7c13048a8a3323408567e5ffe18'}]}, 'timestamp': '2025-11-22 08:34:36.911044', '_unique_id': 'aec07010f9094683a8b95ac653693a48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.912 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f788e369-ddba-47aa-9eec-20ba64ca92d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.912619', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '1450bea4-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': 'efb8ed0644f561c244f71b329538e107f13baad44eb6bbe0a2110dc0788e4604'}]}, 'timestamp': '2025-11-22 08:34:36.912922', '_unique_id': 'f991f93f6dbb4968a72b0c524005d92f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.914 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.914 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>]
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.915 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.outgoing.packets volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6f20009-5f05-4bd9-a7c6-1a83bfaa9709', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 308, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.915173', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '14512376-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': 'e1bbb2ef3dc59606905e51c7de44310552cc033193a8b1f48a7a3ce2e46d4926'}]}, 'timestamp': '2025-11-22 08:34:36.915528', '_unique_id': '268f73d518384f3ab320661d5c42c6b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.917 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.917 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>]
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b6e4e63-be5e-4c72-acc2-377fb82e78f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.918033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '145190cc-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '02fa0b037866003eea845c9659ac0339c53bd12c2bb88a1b0eb2ec2a58baec8c'}]}, 'timestamp': '2025-11-22 08:34:36.918300', '_unique_id': 'bea89a325e9144ab80af071624537818'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.944 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.write.requests volume: 329 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.945 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3036c029-baf6-430a-910a-7ae1fbb645a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 329, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.919982', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1455a4b4-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '64f0cf2d05823ebfd96f193ae9ba20376f70c079504c1b020e43fe17dfeb7233'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.919982', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1455b53a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '7856a2d5ef614e441929ee3895d62cf74b36aff900ba33896fb75f4dff6641e3'}]}, 'timestamp': '2025-11-22 08:34:36.945531', '_unique_id': 'ae6fa43588854749b6444cebd87fade6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.948 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.948 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.read.latency volume: 3502563652 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.948 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.read.latency volume: 111889879 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '305e06f6-7370-4113-930b-5dcb63cde29d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3502563652, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.948122', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1456293e-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': 'df45b829dcfb7526eeb181a45dec1bb69d6479468c026b3745235474a8afa70d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 111889879, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': 
None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.948122', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '145634d8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '303288638b06da1da43cf2ec29b0b69e9c2ac529ad5bf9c94eade4f63f221574'}]}, 'timestamp': '2025-11-22 08:34:36.948749', '_unique_id': '8527fb07ac184e45a43d8de4facc0cbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.951 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.write.latency volume: 5156058974 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.951 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56f7c581-7663-4ddf-bc4a-ad24dbb3bda8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5156058974, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.950983', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '14569a04-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '9e221b19c7c6f0491a425454efccfdf3f95d43494704697400761dc9380bf227'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 
'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.950983', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1456a6e8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '2befdbc0e4e5676b2cd55052f8e127ec5b7ac588380b1c6faff16eea77b4cbdd'}]}, 'timestamp': '2025-11-22 08:34:36.951689', '_unique_id': '0211e96cf04149dba81f4bcc222568da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.952 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.953 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.954 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.read.requests volume: 1141 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.954 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4370eb7-041c-4ce7-a89a-207adddc1af4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1141, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.954002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '14570fc0-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '12deb8f2ee89c7824ff98b10c084839eff60cb2dbff5bddff67716aa331b12c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.954002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '14571c18-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': 'c09be84e1dfea2f5ba5efd1d3340918e047de19de4235106830554fbc08db927'}]}, 'timestamp': '2025-11-22 08:34:36.954682', '_unique_id': 'a6adccd54e174fbfade1c4c67931d19c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.956 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.write.bytes volume: 73048064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.957 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '347f77ba-ae86-4f6f-a8e1-919cf62a6b7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73048064, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.956931', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '145782fc-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '26ee65f517beaeddd506f2e277f5832c529ec65fb3e31f8406f91b0dac7b96b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.956931', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '14579170-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '500ac93da8411cceb7d0af09af7e453c0ac5775640aef3a7bce7e71dc71b3eec'}]}, 'timestamp': '2025-11-22 08:34:36.957687', '_unique_id': 'c41068b9a18d4d5fab4039a235543840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.959 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab5dcdb9-324c-4b5d-b2c1-b22846098ca9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.959877', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '1457f5e8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '81a1fd2651c129c6c95f6bb6430a16480d5de5d8f93b8f1b02a1afc4df97ef2f'}]}, 'timestamp': '2025-11-22 08:34:36.960263', '_unique_id': '7ab2e76a99874373b735c101396aac91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.961 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.outgoing.bytes volume: 45678 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80dcec62-ffc0-4e86-8096-c215c5bc8b6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 45678, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.961995', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '145847a0-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '959a461d20fed1ed8f0e959801e9aaa5659a012b50254b7ff3a6214963288265'}]}, 'timestamp': '2025-11-22 08:34:36.962311', '_unique_id': 'f5003a3cdf66459ca074262fb0ac37a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.964 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.read.bytes volume: 31091200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.965 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27c18328-491e-4e59-be8e-9ed7c07a45bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31091200, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-vda', 'timestamp': '2025-11-22T08:34:36.964413', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1458ad3a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': '86403075320cdcc97a20175bd8ce4a9c207cf4c91a2e73448b8636a57f3b258e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105-sda', 'timestamp': '2025-11-22T08:34:36.964413', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1458bdc0-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.619494847, 'message_signature': 'b2a2e98165faf08858438e4dcec851f5cf287f1e84747553e8c6f7e158bcb098'}]}, 'timestamp': '2025-11-22 08:34:36.965403', '_unique_id': '2e4f1157c922434aa87e464d1d2a6c93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.968 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '042f3286-3b93-42cf-9bd5-3181bda16418', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a5-28c4266d-8891-42aa-b05f-9e25e32a2105-tapbfb6d679-03', 'timestamp': '2025-11-22T08:34:36.968883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'tapbfb6d679-03', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:bf:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb6d679-03'}, 'message_id': '14595898-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.557819011, 'message_signature': '094b768a61f765ce275fdb84186af14a7b39f540876c9c4e34189c7d27d386b7'}]}, 'timestamp': '2025-11-22 08:34:36.969430', '_unique_id': 'ae5abdbe28fb4df498ce6888dcafe5c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.970 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.971 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.972 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.972 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-839460819>]
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.972 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.972 12 DEBUG ceilometer.compute.pollsters [-] 28c4266d-8891-42aa-b05f-9e25e32a2105/cpu volume: 15290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb4f6175-bb25-4730-84c0-9bd05b527ef8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15290000000, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'timestamp': '2025-11-22T08:34:36.972913', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-839460819', 'name': 'instance-000000a5', 'instance_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1459f654-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7217.599755442, 'message_signature': 'a3e784c158a9ffa6d39a66841522bc13119b446d3d884930e7f8c65bb3fdecc6'}]}, 'timestamp': '2025-11-22 08:34:36.973470', '_unique_id': 'c55ac9f2bb7849e68360d00d18b35c7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:34:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:34:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:34:38 np0005531888 nova_compute[186788]: 2025-11-22 08:34:38.434 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:38 np0005531888 nova_compute[186788]: 2025-11-22 08:34:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:39 np0005531888 nova_compute[186788]: 2025-11-22 08:34:39.323 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:39 np0005531888 podman[245917]: 2025-11-22 08:34:39.687094706 +0000 UTC m=+0.055133556 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:34:39 np0005531888 podman[245918]: 2025-11-22 08:34:39.719509954 +0000 UTC m=+0.084231303 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:34:39 np0005531888 podman[245916]: 2025-11-22 08:34:39.725593094 +0000 UTC m=+0.092327262 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, distribution-scope=public)
Nov 22 03:34:41 np0005531888 nova_compute[186788]: 2025-11-22 08:34:41.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:41 np0005531888 nova_compute[186788]: 2025-11-22 08:34:41.964 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:34:41 np0005531888 nova_compute[186788]: 2025-11-22 08:34:41.977 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:34:43 np0005531888 nova_compute[186788]: 2025-11-22 08:34:43.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:44 np0005531888 nova_compute[186788]: 2025-11-22 08:34:44.325 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:34:45Z|00680|binding|INFO|Releasing lport c2e4c3a5-2adc-45b3-861d-eb69d02fc3dd from this chassis (sb_readonly=0)
Nov 22 03:34:45 np0005531888 nova_compute[186788]: 2025-11-22 08:34:45.963 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.202 186792 DEBUG nova.compute.manager [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-changed-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.203 186792 DEBUG nova.compute.manager [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Refreshing instance network info cache due to event network-changed-bfb6d679-0393-475e-aa21-80b081b6dd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.203 186792 DEBUG oslo_concurrency.lockutils [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.203 186792 DEBUG oslo_concurrency.lockutils [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.204 186792 DEBUG nova.network.neutron [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Refreshing network info cache for port bfb6d679-0393-475e-aa21-80b081b6dd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.288 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.290 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.291 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.313 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.314 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.314 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.314 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.315 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.325 186792 INFO nova.compute.manager [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Terminating instance#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.335 186792 DEBUG nova.compute.manager [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:34:47 np0005531888 kernel: tapbfb6d679-03 (unregistering): left promiscuous mode
Nov 22 03:34:47 np0005531888 NetworkManager[55166]: <info>  [1763800487.3697] device (tapbfb6d679-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:34:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:34:47Z|00681|binding|INFO|Releasing lport bfb6d679-0393-475e-aa21-80b081b6dd4a from this chassis (sb_readonly=0)
Nov 22 03:34:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:34:47Z|00682|binding|INFO|Setting lport bfb6d679-0393-475e-aa21-80b081b6dd4a down in Southbound
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.380 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 ovn_controller[95067]: 2025-11-22T08:34:47Z|00683|binding|INFO|Removing iface tapbfb6d679-03 ovn-installed in OVS
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.390 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:bf:79 10.100.0.4'], port_security=['fa:16:3e:72:bf:79 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '28c4266d-8891-42aa-b05f-9e25e32a2105', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4283ce39-f10b-4253-aafd-7adef97db372', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5e07858-1687-49e6-a10b-a34ed4fdb0a1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=bfb6d679-0393-475e-aa21-80b081b6dd4a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.392 104023 INFO neutron.agent.ovn.metadata.agent [-] Port bfb6d679-0393-475e-aa21-80b081b6dd4a in datapath e034377b-e7dc-4d6e-b7c8-53f948b62761 unbound from our chassis#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.393 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e034377b-e7dc-4d6e-b7c8-53f948b62761, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.395 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9fafb0-bc78-4a1f-a3ce-bfd969f8ca9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.395 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761 namespace which is not needed anymore#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.399 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Nov 22 03:34:47 np0005531888 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a5.scope: Consumed 18.777s CPU time.
Nov 22 03:34:47 np0005531888 systemd-machined[153106]: Machine qemu-81-instance-000000a5 terminated.
Nov 22 03:34:47 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [NOTICE]   (245576) : haproxy version is 2.8.14-c23fe91
Nov 22 03:34:47 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [NOTICE]   (245576) : path to executable is /usr/sbin/haproxy
Nov 22 03:34:47 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [WARNING]  (245576) : Exiting Master process...
Nov 22 03:34:47 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [ALERT]    (245576) : Current worker (245578) exited with code 143 (Terminated)
Nov 22 03:34:47 np0005531888 neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761[245572]: [WARNING]  (245576) : All workers exited. Exiting... (0)
Nov 22 03:34:47 np0005531888 systemd[1]: libpod-17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9.scope: Deactivated successfully.
Nov 22 03:34:47 np0005531888 podman[246010]: 2025-11-22 08:34:47.529399716 +0000 UTC m=+0.047844697 container died 17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:34:47 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9-userdata-shm.mount: Deactivated successfully.
Nov 22 03:34:47 np0005531888 systemd[1]: var-lib-containers-storage-overlay-0f1b481e7e02a7f11b2697903e95d424d89a633f8953440536b30ec15739c669-merged.mount: Deactivated successfully.
Nov 22 03:34:47 np0005531888 podman[246010]: 2025-11-22 08:34:47.589113205 +0000 UTC m=+0.107558166 container cleanup 17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:34:47 np0005531888 systemd[1]: libpod-conmon-17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9.scope: Deactivated successfully.
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.612 186792 INFO nova.virt.libvirt.driver [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Instance destroyed successfully.#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.613 186792 DEBUG nova.objects.instance [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 28c4266d-8891-42aa-b05f-9e25e32a2105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.625 186792 DEBUG nova.virt.libvirt.vif [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-839460819',display_name='tempest-TestNetworkBasicOps-server-839460819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-839460819',id=165,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOfiPITIfaG7HOmmrU+dUnF4R1GQwF/SErx/rVrjQAma6JYHV+pzvBu8LoitrnT3oUHPG7p23mRG8NO9kH96MBDZSSjQFLRMUUQzr+b8Rb5gn4T1YV7NdB2v+tYxsKF1Q==',key_name='tempest-TestNetworkBasicOps-609619719',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:33:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-0vb8uett',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:33:35Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=28c4266d-8891-42aa-b05f-9e25e32a2105,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.626 186792 DEBUG nova.network.os_vif_util [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.626 186792 DEBUG nova.network.os_vif_util [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.627 186792 DEBUG os_vif [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.629 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.629 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfb6d679-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.631 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.632 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.635 186792 INFO os_vif [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:bf:79,bridge_name='br-int',has_traffic_filtering=True,id=bfb6d679-0393-475e-aa21-80b081b6dd4a,network=Network(e034377b-e7dc-4d6e-b7c8-53f948b62761),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb6d679-03')#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.636 186792 INFO nova.virt.libvirt.driver [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Deleting instance files /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105_del#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.637 186792 INFO nova.virt.libvirt.driver [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Deletion of /var/lib/nova/instances/28c4266d-8891-42aa-b05f-9e25e32a2105_del complete#033[00m
Nov 22 03:34:47 np0005531888 podman[246053]: 2025-11-22 08:34:47.671943402 +0000 UTC m=+0.053930427 container remove 17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.676 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6bc566-f20e-47c6-b9db-8db7b0f3b2e7]: (4, ('Sat Nov 22 08:34:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761 (17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9)\n17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9\nSat Nov 22 08:34:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761 (17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9)\n17b293c29ff8144445646d9d894d76975f12348fb2f5cd8ab42e5f03c84eebf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.678 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[91fa3209-2417-46b3-ba03-84de680b1003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.679 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape034377b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.680 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 kernel: tape034377b-e0: left promiscuous mode
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.691 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.694 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d719243c-2737-4cf9-9527-7d6353264e08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.706 186792 INFO nova.compute.manager [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.706 186792 DEBUG oslo.service.loopingcall [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.706 186792 DEBUG nova.compute.manager [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:34:47 np0005531888 nova_compute[186788]: 2025-11-22 08:34:47.706 186792 DEBUG nova.network.neutron [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.711 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed0ab37-234e-4717-9c76-f3e246fb4551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.713 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c3901fb0-1684-4f56-9d1a-21b164ef6e81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.730 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa21b26-ca45-450d-80b4-5d177c30c213]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715524, 'reachable_time': 18557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246069, 'error': None, 'target': 'ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.732 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e034377b-e7dc-4d6e-b7c8-53f948b62761 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:34:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:47.733 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[38d08b97-4b01-4c02-b97e-a4f555fca0ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:34:47 np0005531888 systemd[1]: run-netns-ovnmeta\x2de034377b\x2de7dc\x2d4d6e\x2db7c8\x2d53f948b62761.mount: Deactivated successfully.
Nov 22 03:34:48 np0005531888 nova_compute[186788]: 2025-11-22 08:34:48.253 186792 DEBUG nova.compute.manager [req-5f5a6422-2542-46e6-a177-eb7a5f15d75f req-0cc44e87-f9b4-43a3-9613-8a74ef98d7eb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-vif-unplugged-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:34:48 np0005531888 nova_compute[186788]: 2025-11-22 08:34:48.253 186792 DEBUG oslo_concurrency.lockutils [req-5f5a6422-2542-46e6-a177-eb7a5f15d75f req-0cc44e87-f9b4-43a3-9613-8a74ef98d7eb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:48 np0005531888 nova_compute[186788]: 2025-11-22 08:34:48.254 186792 DEBUG oslo_concurrency.lockutils [req-5f5a6422-2542-46e6-a177-eb7a5f15d75f req-0cc44e87-f9b4-43a3-9613-8a74ef98d7eb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:48 np0005531888 nova_compute[186788]: 2025-11-22 08:34:48.254 186792 DEBUG oslo_concurrency.lockutils [req-5f5a6422-2542-46e6-a177-eb7a5f15d75f req-0cc44e87-f9b4-43a3-9613-8a74ef98d7eb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:48 np0005531888 nova_compute[186788]: 2025-11-22 08:34:48.255 186792 DEBUG nova.compute.manager [req-5f5a6422-2542-46e6-a177-eb7a5f15d75f req-0cc44e87-f9b4-43a3-9613-8a74ef98d7eb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] No waiting events found dispatching network-vif-unplugged-bfb6d679-0393-475e-aa21-80b081b6dd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:34:48 np0005531888 nova_compute[186788]: 2025-11-22 08:34:48.255 186792 DEBUG nova.compute.manager [req-5f5a6422-2542-46e6-a177-eb7a5f15d75f req-0cc44e87-f9b4-43a3-9613-8a74ef98d7eb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-vif-unplugged-bfb6d679-0393-475e-aa21-80b081b6dd4a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:34:49 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:34:49.292 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.326 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.346 186792 DEBUG nova.network.neutron [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.382 186792 INFO nova.compute.manager [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Took 1.68 seconds to deallocate network for instance.#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.479 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.480 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.538 186792 DEBUG nova.compute.provider_tree [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.549 186792 DEBUG nova.scheduler.client.report [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.575 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.607 186792 INFO nova.scheduler.client.report [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 28c4266d-8891-42aa-b05f-9e25e32a2105#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.698 186792 DEBUG oslo_concurrency.lockutils [None req-ca04f954-189e-475b-a859-406722b28488 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.826 186792 DEBUG nova.network.neutron [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updated VIF entry in instance network info cache for port bfb6d679-0393-475e-aa21-80b081b6dd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.827 186792 DEBUG nova.network.neutron [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Updating instance_info_cache with network_info: [{"id": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "address": "fa:16:3e:72:bf:79", "network": {"id": "e034377b-e7dc-4d6e-b7c8-53f948b62761", "bridge": "br-int", "label": "tempest-network-smoke--1397682704", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6d679-03", "ovs_interfaceid": "bfb6d679-0393-475e-aa21-80b081b6dd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:34:49 np0005531888 nova_compute[186788]: 2025-11-22 08:34:49.844 186792 DEBUG oslo_concurrency.lockutils [req-8c0dac71-c83e-4857-aacd-bb941930ce24 req-44ff7660-bdf7-4af7-a90e-9ca407e6fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-28c4266d-8891-42aa-b05f-9e25e32a2105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.328 186792 DEBUG nova.compute.manager [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.329 186792 DEBUG oslo_concurrency.lockutils [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.329 186792 DEBUG oslo_concurrency.lockutils [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.329 186792 DEBUG oslo_concurrency.lockutils [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "28c4266d-8891-42aa-b05f-9e25e32a2105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.329 186792 DEBUG nova.compute.manager [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] No waiting events found dispatching network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.329 186792 WARNING nova.compute.manager [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received unexpected event network-vif-plugged-bfb6d679-0393-475e-aa21-80b081b6dd4a for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.330 186792 DEBUG nova.compute.manager [req-b28400da-b4db-4ad7-a5d5-b3e6777f7525 req-823b7e0e-1a0e-415d-ae1a-ed87298399ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Received event network-vif-deleted-bfb6d679-0393-475e-aa21-80b081b6dd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:34:50 np0005531888 nova_compute[186788]: 2025-11-22 08:34:50.962 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:52 np0005531888 nova_compute[186788]: 2025-11-22 08:34:52.633 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:53 np0005531888 nova_compute[186788]: 2025-11-22 08:34:53.295 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:53 np0005531888 nova_compute[186788]: 2025-11-22 08:34:53.381 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:54 np0005531888 nova_compute[186788]: 2025-11-22 08:34:54.329 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:54 np0005531888 podman[246073]: 2025-11-22 08:34:54.67277313 +0000 UTC m=+0.049556689 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:34:54 np0005531888 podman[246072]: 2025-11-22 08:34:54.673187821 +0000 UTC m=+0.052418290 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:34:57 np0005531888 nova_compute[186788]: 2025-11-22 08:34:57.636 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:59 np0005531888 nova_compute[186788]: 2025-11-22 08:34:59.331 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:02 np0005531888 nova_compute[186788]: 2025-11-22 08:35:02.611 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800487.60973, 28c4266d-8891-42aa-b05f-9e25e32a2105 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:35:02 np0005531888 nova_compute[186788]: 2025-11-22 08:35:02.612 186792 INFO nova.compute.manager [-] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:35:02 np0005531888 nova_compute[186788]: 2025-11-22 08:35:02.640 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:02 np0005531888 nova_compute[186788]: 2025-11-22 08:35:02.675 186792 DEBUG nova.compute.manager [None req-fca09233-5143-4404-8040-4fe872205ab5 - - - - - -] [instance: 28c4266d-8891-42aa-b05f-9e25e32a2105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:35:04 np0005531888 nova_compute[186788]: 2025-11-22 08:35:04.333 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:06 np0005531888 podman[246116]: 2025-11-22 08:35:06.670331353 +0000 UTC m=+0.045916712 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:35:06 np0005531888 podman[246115]: 2025-11-22 08:35:06.701413485 +0000 UTC m=+0.079318603 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:35:07 np0005531888 nova_compute[186788]: 2025-11-22 08:35:07.643 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:09 np0005531888 nova_compute[186788]: 2025-11-22 08:35:09.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:10 np0005531888 podman[200996]: time="2025-11-22T08:35:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 03:35:10 np0005531888 podman[200996]: @ - - [22/Nov/2025:08:35:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22840 "" "Go-http-client/1.1"
Nov 22 03:35:10 np0005531888 podman[246159]: 2025-11-22 08:35:10.694834071 +0000 UTC m=+0.068479697 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Nov 22 03:35:10 np0005531888 podman[246160]: 2025-11-22 08:35:10.702419954 +0000 UTC m=+0.073099195 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:35:10 np0005531888 podman[246161]: 2025-11-22 08:35:10.731729651 +0000 UTC m=+0.090153639 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:35:12 np0005531888 nova_compute[186788]: 2025-11-22 08:35:12.646 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:14 np0005531888 nova_compute[186788]: 2025-11-22 08:35:14.336 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:15 np0005531888 nova_compute[186788]: 2025-11-22 08:35:15.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:17 np0005531888 nova_compute[186788]: 2025-11-22 08:35:17.650 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:17 np0005531888 nova_compute[186788]: 2025-11-22 08:35:17.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:17 np0005531888 nova_compute[186788]: 2025-11-22 08:35:17.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:17 np0005531888 nova_compute[186788]: 2025-11-22 08:35:17.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:35:17 np0005531888 nova_compute[186788]: 2025-11-22 08:35:17.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:35:17 np0005531888 nova_compute[186788]: 2025-11-22 08:35:17.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:35:19 np0005531888 nova_compute[186788]: 2025-11-22 08:35:19.338 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:22 np0005531888 nova_compute[186788]: 2025-11-22 08:35:22.654 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:23 np0005531888 nova_compute[186788]: 2025-11-22 08:35:23.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:24 np0005531888 nova_compute[186788]: 2025-11-22 08:35:24.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:25 np0005531888 podman[246224]: 2025-11-22 08:35:25.685393764 +0000 UTC m=+0.053205317 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:35:25 np0005531888 podman[246223]: 2025-11-22 08:35:25.732467355 +0000 UTC m=+0.091923225 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:35:25 np0005531888 nova_compute[186788]: 2025-11-22 08:35:25.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:27 np0005531888 nova_compute[186788]: 2025-11-22 08:35:27.658 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:29 np0005531888 nova_compute[186788]: 2025-11-22 08:35:29.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:30 np0005531888 nova_compute[186788]: 2025-11-22 08:35:30.891 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:31 np0005531888 nova_compute[186788]: 2025-11-22 08:35:31.957 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:31 np0005531888 nova_compute[186788]: 2025-11-22 08:35:31.958 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:32 np0005531888 nova_compute[186788]: 2025-11-22 08:35:32.661 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:32 np0005531888 nova_compute[186788]: 2025-11-22 08:35:32.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:32 np0005531888 nova_compute[186788]: 2025-11-22 08:35:32.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:35:32 np0005531888 nova_compute[186788]: 2025-11-22 08:35:32.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:35:32 np0005531888 nova_compute[186788]: 2025-11-22 08:35:32.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:35:32 np0005531888 nova_compute[186788]: 2025-11-22 08:35:32.977 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.147 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.148 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5709MB free_disk=73.26720809936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.149 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.149 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.222 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.222 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.241 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.264 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.265 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.281 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.312 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.339 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.358 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.434 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:35:33 np0005531888 nova_compute[186788]: 2025-11-22 08:35:33.434 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:35:34 np0005531888 nova_compute[186788]: 2025-11-22 08:35:34.345 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:36 np0005531888 nova_compute[186788]: 2025-11-22 08:35:36.434 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:36 np0005531888 nova_compute[186788]: 2025-11-22 08:35:36.435 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:35:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:35:36.855 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:35:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:35:36.855 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:35:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:35:36.855 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:35:37 np0005531888 nova_compute[186788]: 2025-11-22 08:35:37.664 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:37 np0005531888 podman[246269]: 2025-11-22 08:35:37.687251612 +0000 UTC m=+0.060741340 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:35:37 np0005531888 podman[246270]: 2025-11-22 08:35:37.705490117 +0000 UTC m=+0.077782355 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:35:38 np0005531888 ovn_controller[95067]: 2025-11-22T08:35:38Z|00684|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 22 03:35:39 np0005531888 nova_compute[186788]: 2025-11-22 08:35:39.347 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:41 np0005531888 podman[246313]: 2025-11-22 08:35:41.678364147 +0000 UTC m=+0.055484285 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Nov 22 03:35:41 np0005531888 podman[246314]: 2025-11-22 08:35:41.697526276 +0000 UTC m=+0.063754596 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:35:41 np0005531888 podman[246315]: 2025-11-22 08:35:41.70278554 +0000 UTC m=+0.070725064 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:35:42 np0005531888 nova_compute[186788]: 2025-11-22 08:35:42.667 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:44 np0005531888 nova_compute[186788]: 2025-11-22 08:35:44.348 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:47 np0005531888 nova_compute[186788]: 2025-11-22 08:35:47.671 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:49 np0005531888 nova_compute[186788]: 2025-11-22 08:35:49.352 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:50 np0005531888 nova_compute[186788]: 2025-11-22 08:35:50.879 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:52 np0005531888 nova_compute[186788]: 2025-11-22 08:35:52.675 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:54 np0005531888 nova_compute[186788]: 2025-11-22 08:35:54.352 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:56 np0005531888 podman[246382]: 2025-11-22 08:35:56.676669999 +0000 UTC m=+0.046312832 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:35:56 np0005531888 podman[246381]: 2025-11-22 08:35:56.676669779 +0000 UTC m=+0.050524889 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:35:57 np0005531888 nova_compute[186788]: 2025-11-22 08:35:57.678 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:59 np0005531888 nova_compute[186788]: 2025-11-22 08:35:59.354 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:02 np0005531888 nova_compute[186788]: 2025-11-22 08:36:02.682 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:04 np0005531888 nova_compute[186788]: 2025-11-22 08:36:04.355 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:07 np0005531888 nova_compute[186788]: 2025-11-22 08:36:07.686 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:08 np0005531888 podman[246423]: 2025-11-22 08:36:08.678689738 +0000 UTC m=+0.051967865 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:36:08 np0005531888 podman[246422]: 2025-11-22 08:36:08.685789289 +0000 UTC m=+0.062041022 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:36:09 np0005531888 nova_compute[186788]: 2025-11-22 08:36:09.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:12 np0005531888 nova_compute[186788]: 2025-11-22 08:36:12.394 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:36:12.394 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:36:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:36:12.396 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:36:12 np0005531888 podman[246468]: 2025-11-22 08:36:12.678749532 +0000 UTC m=+0.048847047 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 03:36:12 np0005531888 nova_compute[186788]: 2025-11-22 08:36:12.687 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:12 np0005531888 podman[246467]: 2025-11-22 08:36:12.709536117 +0000 UTC m=+0.082136155 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 22 03:36:12 np0005531888 podman[246469]: 2025-11-22 08:36:12.714901034 +0000 UTC m=+0.078254066 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:36:14 np0005531888 nova_compute[186788]: 2025-11-22 08:36:14.359 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:16 np0005531888 nova_compute[186788]: 2025-11-22 08:36:16.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:17 np0005531888 nova_compute[186788]: 2025-11-22 08:36:17.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:17 np0005531888 nova_compute[186788]: 2025-11-22 08:36:17.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:17 np0005531888 nova_compute[186788]: 2025-11-22 08:36:17.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:17 np0005531888 nova_compute[186788]: 2025-11-22 08:36:17.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:36:17 np0005531888 nova_compute[186788]: 2025-11-22 08:36:17.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:36:17 np0005531888 nova_compute[186788]: 2025-11-22 08:36:17.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:36:19 np0005531888 nova_compute[186788]: 2025-11-22 08:36:19.360 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:36:22.398 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:36:22 np0005531888 nova_compute[186788]: 2025-11-22 08:36:22.694 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:23 np0005531888 nova_compute[186788]: 2025-11-22 08:36:23.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:24 np0005531888 nova_compute[186788]: 2025-11-22 08:36:24.362 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:26 np0005531888 nova_compute[186788]: 2025-11-22 08:36:26.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:27 np0005531888 podman[246534]: 2025-11-22 08:36:27.679848845 +0000 UTC m=+0.050942830 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:36:27 np0005531888 podman[246535]: 2025-11-22 08:36:27.68434447 +0000 UTC m=+0.051402871 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 03:36:27 np0005531888 nova_compute[186788]: 2025-11-22 08:36:27.697 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:29 np0005531888 nova_compute[186788]: 2025-11-22 08:36:29.365 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:31 np0005531888 nova_compute[186788]: 2025-11-22 08:36:31.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:31 np0005531888 nova_compute[186788]: 2025-11-22 08:36:31.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:32 np0005531888 nova_compute[186788]: 2025-11-22 08:36:32.700 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:32 np0005531888 nova_compute[186788]: 2025-11-22 08:36:32.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:32 np0005531888 nova_compute[186788]: 2025-11-22 08:36:32.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:36:32 np0005531888 nova_compute[186788]: 2025-11-22 08:36:32.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:36:32 np0005531888 nova_compute[186788]: 2025-11-22 08:36:32.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:36:32 np0005531888 nova_compute[186788]: 2025-11-22 08:36:32.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.131 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.132 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5715MB free_disk=73.26722717285156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.132 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.133 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.275 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.385 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.400 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.401 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:36:33 np0005531888 nova_compute[186788]: 2025-11-22 08:36:33.401 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:36:34 np0005531888 nova_compute[186788]: 2025-11-22 08:36:34.366 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.853 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.854 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:36:36.855 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:36:36.856 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:36:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:36:36.856 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:36:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:36:36.856 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:36:37 np0005531888 nova_compute[186788]: 2025-11-22 08:36:37.402 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:37 np0005531888 nova_compute[186788]: 2025-11-22 08:36:37.403 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:36:37 np0005531888 nova_compute[186788]: 2025-11-22 08:36:37.703 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:39 np0005531888 nova_compute[186788]: 2025-11-22 08:36:39.367 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:39 np0005531888 podman[246578]: 2025-11-22 08:36:39.687513279 +0000 UTC m=+0.057889786 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:36:39 np0005531888 podman[246577]: 2025-11-22 08:36:39.695309278 +0000 UTC m=+0.065692356 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:36:42 np0005531888 nova_compute[186788]: 2025-11-22 08:36:42.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:43 np0005531888 podman[246618]: 2025-11-22 08:36:43.675882505 +0000 UTC m=+0.052891330 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Nov 22 03:36:43 np0005531888 podman[246619]: 2025-11-22 08:36:43.684668289 +0000 UTC m=+0.059647082 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:36:43 np0005531888 podman[246620]: 2025-11-22 08:36:43.711407941 +0000 UTC m=+0.081907120 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:36:44 np0005531888 nova_compute[186788]: 2025-11-22 08:36:44.369 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:47 np0005531888 nova_compute[186788]: 2025-11-22 08:36:47.710 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:49 np0005531888 nova_compute[186788]: 2025-11-22 08:36:49.371 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:52 np0005531888 nova_compute[186788]: 2025-11-22 08:36:52.714 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:53 np0005531888 nova_compute[186788]: 2025-11-22 08:36:53.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:54 np0005531888 nova_compute[186788]: 2025-11-22 08:36:54.373 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:57 np0005531888 nova_compute[186788]: 2025-11-22 08:36:57.718 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:58 np0005531888 podman[246685]: 2025-11-22 08:36:58.686635363 +0000 UTC m=+0.058311628 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:36:58 np0005531888 podman[246684]: 2025-11-22 08:36:58.687051044 +0000 UTC m=+0.062857915 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:36:59 np0005531888 nova_compute[186788]: 2025-11-22 08:36:59.375 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:02 np0005531888 nova_compute[186788]: 2025-11-22 08:37:02.721 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:04 np0005531888 nova_compute[186788]: 2025-11-22 08:37:04.375 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:07 np0005531888 nova_compute[186788]: 2025-11-22 08:37:07.724 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:09 np0005531888 nova_compute[186788]: 2025-11-22 08:37:09.377 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:10 np0005531888 podman[246726]: 2025-11-22 08:37:10.682832405 +0000 UTC m=+0.056836281 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:37:10 np0005531888 podman[246727]: 2025-11-22 08:37:10.703549883 +0000 UTC m=+0.063871420 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:37:12 np0005531888 nova_compute[186788]: 2025-11-22 08:37:12.728 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:14 np0005531888 nova_compute[186788]: 2025-11-22 08:37:14.379 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:14 np0005531888 podman[246769]: 2025-11-22 08:37:14.673611832 +0000 UTC m=+0.050987621 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 22 03:37:14 np0005531888 podman[246770]: 2025-11-22 08:37:14.682488258 +0000 UTC m=+0.053540476 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 03:37:14 np0005531888 podman[246776]: 2025-11-22 08:37:14.720238021 +0000 UTC m=+0.079562080 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:37:16 np0005531888 nova_compute[186788]: 2025-11-22 08:37:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:37:17.077 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:37:17 np0005531888 nova_compute[186788]: 2025-11-22 08:37:17.077 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:17 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:37:17.078 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:37:17 np0005531888 nova_compute[186788]: 2025-11-22 08:37:17.730 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:17 np0005531888 nova_compute[186788]: 2025-11-22 08:37:17.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:17 np0005531888 nova_compute[186788]: 2025-11-22 08:37:17.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:17 np0005531888 nova_compute[186788]: 2025-11-22 08:37:17.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:37:17 np0005531888 nova_compute[186788]: 2025-11-22 08:37:17.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:37:18 np0005531888 nova_compute[186788]: 2025-11-22 08:37:18.011 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:37:19 np0005531888 nova_compute[186788]: 2025-11-22 08:37:19.380 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:37:22.080 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:37:22 np0005531888 nova_compute[186788]: 2025-11-22 08:37:22.733 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:24 np0005531888 nova_compute[186788]: 2025-11-22 08:37:24.382 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:25 np0005531888 nova_compute[186788]: 2025-11-22 08:37:25.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:27 np0005531888 nova_compute[186788]: 2025-11-22 08:37:27.736 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:27 np0005531888 nova_compute[186788]: 2025-11-22 08:37:27.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:29 np0005531888 nova_compute[186788]: 2025-11-22 08:37:29.383 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:29 np0005531888 podman[246832]: 2025-11-22 08:37:29.697938606 +0000 UTC m=+0.064280221 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:37:29 np0005531888 podman[246833]: 2025-11-22 08:37:29.712379745 +0000 UTC m=+0.074247505 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.740 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:37:32 np0005531888 nova_compute[186788]: 2025-11-22 08:37:32.984 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.158 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.159 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5714MB free_disk=73.26722717285156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.159 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.159 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.276 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.277 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.327 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.702 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.704 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:37:33 np0005531888 nova_compute[186788]: 2025-11-22 08:37:33.704 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:37:34 np0005531888 nova_compute[186788]: 2025-11-22 08:37:34.384 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:34 np0005531888 nova_compute[186788]: 2025-11-22 08:37:34.705 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:37:36.856 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:37:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:37:36.857 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:37:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:37:36.857 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:37:37 np0005531888 nova_compute[186788]: 2025-11-22 08:37:37.744 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:38 np0005531888 nova_compute[186788]: 2025-11-22 08:37:38.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:38 np0005531888 nova_compute[186788]: 2025-11-22 08:37:38.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:37:39 np0005531888 nova_compute[186788]: 2025-11-22 08:37:39.387 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:41 np0005531888 podman[246876]: 2025-11-22 08:37:41.67734078 +0000 UTC m=+0.046645250 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:37:41 np0005531888 podman[246875]: 2025-11-22 08:37:41.677717341 +0000 UTC m=+0.052006879 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:37:42 np0005531888 nova_compute[186788]: 2025-11-22 08:37:42.747 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:44 np0005531888 nova_compute[186788]: 2025-11-22 08:37:44.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:45 np0005531888 podman[246920]: 2025-11-22 08:37:45.691461592 +0000 UTC m=+0.059522929 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal)
Nov 22 03:37:45 np0005531888 podman[246921]: 2025-11-22 08:37:45.700609445 +0000 UTC m=+0.062557167 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:37:45 np0005531888 podman[246922]: 2025-11-22 08:37:45.736652964 +0000 UTC m=+0.090537090 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:37:47 np0005531888 nova_compute[186788]: 2025-11-22 08:37:47.750 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:49 np0005531888 nova_compute[186788]: 2025-11-22 08:37:49.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:52 np0005531888 nova_compute[186788]: 2025-11-22 08:37:52.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:54 np0005531888 nova_compute[186788]: 2025-11-22 08:37:54.391 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:57 np0005531888 nova_compute[186788]: 2025-11-22 08:37:57.757 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:59 np0005531888 nova_compute[186788]: 2025-11-22 08:37:59.393 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:00 np0005531888 podman[246985]: 2025-11-22 08:38:00.669287774 +0000 UTC m=+0.047677968 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:38:00 np0005531888 podman[246986]: 2025-11-22 08:38:00.697525303 +0000 UTC m=+0.072410117 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.236 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.236 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.272 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.389 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.390 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.397 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.397 186792 INFO nova.compute.claims [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.662 186792 DEBUG nova.compute.provider_tree [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.674 186792 DEBUG nova.scheduler.client.report [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.693 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.694 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.760 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.776 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.777 186792 DEBUG nova.network.neutron [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.804 186792 INFO nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.830 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.971 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.972 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.972 186792 INFO nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Creating image(s)#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.973 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.973 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.974 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:02 np0005531888 nova_compute[186788]: 2025-11-22 08:38:02.986 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.048 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.049 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.050 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.062 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.117 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.118 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.157 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.158 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.159 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.213 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.214 186792 DEBUG nova.virt.disk.api [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.215 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.271 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.272 186792 DEBUG nova.virt.disk.api [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.272 186792 DEBUG nova.objects.instance [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid e304bef1-c7a3-45f0-9975-d9dc37b7fba4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.286 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.287 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Ensure instance console log exists: /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.287 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.288 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.288 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:03 np0005531888 nova_compute[186788]: 2025-11-22 08:38:03.374 186792 DEBUG nova.policy [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:38:04 np0005531888 nova_compute[186788]: 2025-11-22 08:38:04.394 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:05 np0005531888 nova_compute[186788]: 2025-11-22 08:38:05.302 186792 DEBUG nova.network.neutron [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Successfully created port: def7aa64-0c3a-4e1b-9de9-b44e62099d6d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.122 186792 DEBUG nova.network.neutron [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Successfully updated port: def7aa64-0c3a-4e1b-9de9-b44e62099d6d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.140 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.141 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.141 186792 DEBUG nova.network.neutron [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.248 186792 DEBUG nova.compute.manager [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-changed-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.248 186792 DEBUG nova.compute.manager [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Refreshing instance network info cache due to event network-changed-def7aa64-0c3a-4e1b-9de9-b44e62099d6d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.248 186792 DEBUG oslo_concurrency.lockutils [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.355 186792 DEBUG nova.network.neutron [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:38:07 np0005531888 nova_compute[186788]: 2025-11-22 08:38:07.764 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.609 186792 DEBUG nova.network.neutron [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.632 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.632 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Instance network_info: |[{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.632 186792 DEBUG oslo_concurrency.lockutils [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.633 186792 DEBUG nova.network.neutron [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Refreshing network info cache for port def7aa64-0c3a-4e1b-9de9-b44e62099d6d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.635 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Start _get_guest_xml network_info=[{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.640 186792 WARNING nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.651 186792 DEBUG nova.virt.libvirt.host [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.651 186792 DEBUG nova.virt.libvirt.host [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.658 186792 DEBUG nova.virt.libvirt.host [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.659 186792 DEBUG nova.virt.libvirt.host [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.660 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.661 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.661 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.661 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.662 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.662 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.662 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.662 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.662 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.663 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.663 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.663 186792 DEBUG nova.virt.hardware [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.666 186792 DEBUG nova.virt.libvirt.vif [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:37:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-883461096',display_name='tempest-TestNetworkBasicOps-server-883461096',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-883461096',id=169,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD0z/d3D8Z1P4K6JIpGV7Tq4WTuVk8ez/ejc+HnxCd8xIpRVLW+pHpsYU0NMH0CQcfwPQU8cK3poygMokn8lRRcNDaAElwceyW121vP8CjaKUGd0RpUS661cpbp+qYCy8g==',key_name='tempest-TestNetworkBasicOps-1997631990',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-l7ketysl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:38:02Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=e304bef1-c7a3-45f0-9975-d9dc37b7fba4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.666 186792 DEBUG nova.network.os_vif_util [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.667 186792 DEBUG nova.network.os_vif_util [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.668 186792 DEBUG nova.objects.instance [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid e304bef1-c7a3-45f0-9975-d9dc37b7fba4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.682 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <uuid>e304bef1-c7a3-45f0-9975-d9dc37b7fba4</uuid>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <name>instance-000000a9</name>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkBasicOps-server-883461096</nova:name>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:38:08</nova:creationTime>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        <nova:port uuid="def7aa64-0c3a-4e1b-9de9-b44e62099d6d">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <entry name="serial">e304bef1-c7a3-45f0-9975-d9dc37b7fba4</entry>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <entry name="uuid">e304bef1-c7a3-45f0-9975-d9dc37b7fba4</entry>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.config"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:1b:c0:5c"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <target dev="tapdef7aa64-0c"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/console.log" append="off"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:38:08 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:38:08 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:38:08 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:38:08 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.683 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Preparing to wait for external event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.683 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.683 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.684 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.684 186792 DEBUG nova.virt.libvirt.vif [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:37:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-883461096',display_name='tempest-TestNetworkBasicOps-server-883461096',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-883461096',id=169,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD0z/d3D8Z1P4K6JIpGV7Tq4WTuVk8ez/ejc+HnxCd8xIpRVLW+pHpsYU0NMH0CQcfwPQU8cK3poygMokn8lRRcNDaAElwceyW121vP8CjaKUGd0RpUS661cpbp+qYCy8g==',key_name='tempest-TestNetworkBasicOps-1997631990',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-l7ketysl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:38:02Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=e304bef1-c7a3-45f0-9975-d9dc37b7fba4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.684 186792 DEBUG nova.network.os_vif_util [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.685 186792 DEBUG nova.network.os_vif_util [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.685 186792 DEBUG os_vif [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.686 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.686 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.687 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.690 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.690 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdef7aa64-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.691 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdef7aa64-0c, col_values=(('external_ids', {'iface-id': 'def7aa64-0c3a-4e1b-9de9-b44e62099d6d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:c0:5c', 'vm-uuid': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:08 np0005531888 NetworkManager[55166]: <info>  [1763800688.6939] manager: (tapdef7aa64-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.694 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.699 186792 INFO os_vif [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c')#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.758 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.759 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.759 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:1b:c0:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:38:08 np0005531888 nova_compute[186788]: 2025-11-22 08:38:08.760 186792 INFO nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Using config drive#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.360 186792 INFO nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Creating config drive at /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.config#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.365 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpej9jjpl5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.396 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.494 186792 DEBUG oslo_concurrency.processutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpej9jjpl5" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:09 np0005531888 kernel: tapdef7aa64-0c: entered promiscuous mode
Nov 22 03:38:09 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:09Z|00685|binding|INFO|Claiming lport def7aa64-0c3a-4e1b-9de9-b44e62099d6d for this chassis.
Nov 22 03:38:09 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:09Z|00686|binding|INFO|def7aa64-0c3a-4e1b-9de9-b44e62099d6d: Claiming fa:16:3e:1b:c0:5c 10.100.0.13
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.554 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 NetworkManager[55166]: <info>  [1763800689.5572] manager: (tapdef7aa64-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.557 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.574 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c0:5c 10.100.0.13'], port_security=['fa:16:3e:1b:c0:5c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a869b57-a476-415c-ba30-ce607e13fca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31665b59-3e3b-4029-aea2-90d19deb1554', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f98a1991-00c9-465f-9246-5677a61cda3d, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=def7aa64-0c3a-4e1b-9de9-b44e62099d6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.575 104023 INFO neutron.agent.ovn.metadata.agent [-] Port def7aa64-0c3a-4e1b-9de9-b44e62099d6d in datapath 8a869b57-a476-415c-ba30-ce607e13fca8 bound to our chassis#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.576 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8a869b57-a476-415c-ba30-ce607e13fca8#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.588 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e45989-cfc3-496a-8d1b-3756e1aa8256]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.589 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8a869b57-a1 in ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.591 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8a869b57-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.591 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[dffbfa49-9142-4e7e-92f3-0e220628fc1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.592 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0620aa-367b-4504-96e2-785f3ba40dff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 systemd-udevd[247061]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:38:09 np0005531888 systemd-machined[153106]: New machine qemu-82-instance-000000a9.
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.605 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[030bc9d4-1f0a-4264-9f8d-02ce28084cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 NetworkManager[55166]: <info>  [1763800689.6102] device (tapdef7aa64-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:38:09 np0005531888 NetworkManager[55166]: <info>  [1763800689.6111] device (tapdef7aa64-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.620 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:09Z|00687|binding|INFO|Setting lport def7aa64-0c3a-4e1b-9de9-b44e62099d6d ovn-installed in OVS
Nov 22 03:38:09 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:09Z|00688|binding|INFO|Setting lport def7aa64-0c3a-4e1b-9de9-b44e62099d6d up in Southbound
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.626 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 systemd[1]: Started Virtual Machine qemu-82-instance-000000a9.
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.630 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[afcce449-1ad4-4897-98b6-1271fda9f8d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.657 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[9eff67cf-6a26-4429-97cd-d8c6e26c5a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.662 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[660887f5-4ace-4541-90f2-3a212a78655a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 systemd-udevd[247066]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:38:09 np0005531888 NetworkManager[55166]: <info>  [1763800689.6639] manager: (tap8a869b57-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/326)
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.698 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[fe90a4bf-b186-4535-b45e-4253461981e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.700 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[3b35ba14-9fb4-4f49-ada0-a055cb829bd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 NetworkManager[55166]: <info>  [1763800689.7226] device (tap8a869b57-a0): carrier: link connected
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.728 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[318e09e2-027e-442d-b61c-6ca8ffda5482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.746 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a89feb7e-d732-4e5f-80e6-520c268492bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8a869b57-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:0a:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743036, 'reachable_time': 35656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247094, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.760 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[38432fa6-ff61-40ea-ad4f-e6f5627ff488]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:abf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 743036, 'tstamp': 743036}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247095, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.777 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[cc6b4235-75d7-4e06-8cba-f728fe637e25]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8a869b57-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:0a:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743036, 'reachable_time': 35656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247096, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.807 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad5a798-432b-4cb0-9327-e59f740ca0c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.865 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4890a2-9ee5-429c-be32-439a05c96f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.867 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a869b57-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.867 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.867 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a869b57-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.869 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 NetworkManager[55166]: <info>  [1763800689.8700] manager: (tap8a869b57-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Nov 22 03:38:09 np0005531888 kernel: tap8a869b57-a0: entered promiscuous mode
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.871 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8a869b57-a0, col_values=(('external_ids', {'iface-id': 'c85ad4c8-d148-4415-be4e-a8012749d2ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:09Z|00689|binding|INFO|Releasing lport c85ad4c8-d148-4415-be4e-a8012749d2ac from this chassis (sb_readonly=0)
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.885 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8a869b57-a476-415c-ba30-ce607e13fca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8a869b57-a476-415c-ba30-ce607e13fca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.886 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[16cd1a27-3e00-4f5f-b0c6-b25e38299e78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.887 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-8a869b57-a476-415c-ba30-ce607e13fca8
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/8a869b57-a476-415c-ba30-ce607e13fca8.pid.haproxy
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 8a869b57-a476-415c-ba30-ce607e13fca8
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:38:09 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:09.888 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'env', 'PROCESS_TAG=haproxy-8a869b57-a476-415c-ba30-ce607e13fca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8a869b57-a476-415c-ba30-ce607e13fca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.903 186792 DEBUG nova.compute.manager [req-ac328f08-7fb3-4b23-b895-2ecbde006286 req-5ec291da-96d6-4242-ace0-e63db4c15e43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.903 186792 DEBUG oslo_concurrency.lockutils [req-ac328f08-7fb3-4b23-b895-2ecbde006286 req-5ec291da-96d6-4242-ace0-e63db4c15e43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.903 186792 DEBUG oslo_concurrency.lockutils [req-ac328f08-7fb3-4b23-b895-2ecbde006286 req-5ec291da-96d6-4242-ace0-e63db4c15e43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.904 186792 DEBUG oslo_concurrency.lockutils [req-ac328f08-7fb3-4b23-b895-2ecbde006286 req-5ec291da-96d6-4242-ace0-e63db4c15e43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:09 np0005531888 nova_compute[186788]: 2025-11-22 08:38:09.904 186792 DEBUG nova.compute.manager [req-ac328f08-7fb3-4b23-b895-2ecbde006286 req-5ec291da-96d6-4242-ace0-e63db4c15e43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Processing event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.094 186792 DEBUG nova.network.neutron [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updated VIF entry in instance network info cache for port def7aa64-0c3a-4e1b-9de9-b44e62099d6d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.094 186792 DEBUG nova.network.neutron [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.105 186792 DEBUG oslo_concurrency.lockutils [req-20a70c48-b2ac-4204-96c1-e1756c6e492e req-725b6646-abb3-44b4-9e9c-4e5114971b1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:38:10 np0005531888 podman[247128]: 2025-11-22 08:38:10.256170058 +0000 UTC m=+0.069822542 container create 0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:38:10 np0005531888 podman[247128]: 2025-11-22 08:38:10.21039452 +0000 UTC m=+0.024046994 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:38:10 np0005531888 systemd[1]: Started libpod-conmon-0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854.scope.
Nov 22 03:38:10 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:38:10 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f8e09d656a456b9fbf6b764278dd820654e508178aa1a386f465065a4b7ca6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:38:10 np0005531888 podman[247128]: 2025-11-22 08:38:10.363842353 +0000 UTC m=+0.177494827 container init 0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 03:38:10 np0005531888 podman[247128]: 2025-11-22 08:38:10.369927068 +0000 UTC m=+0.183579512 container start 0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:38:10 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [NOTICE]   (247148) : New worker (247150) forked
Nov 22 03:38:10 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [NOTICE]   (247148) : Loading success.
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.753 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.754 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800690.753135, e304bef1-c7a3-45f0-9975-d9dc37b7fba4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.755 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] VM Started (Lifecycle Event)#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.758 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.761 186792 INFO nova.virt.libvirt.driver [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Instance spawned successfully.#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.762 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.863 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.867 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.875 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.875 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.876 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.876 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.876 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.877 186792 DEBUG nova.virt.libvirt.driver [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.904 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.904 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800690.7544043, e304bef1-c7a3-45f0-9975-d9dc37b7fba4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.904 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.939 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.942 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800690.7577665, e304bef1-c7a3-45f0-9975-d9dc37b7fba4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.942 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.975 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.978 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.985 186792 INFO nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Took 8.01 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:38:10 np0005531888 nova_compute[186788]: 2025-11-22 08:38:10.986 186792 DEBUG nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.016 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.088 186792 INFO nova.compute.manager [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Took 8.73 seconds to build instance.#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.109 186792 DEBUG oslo_concurrency.lockutils [None req-829f398b-3d08-4532-8bf9-d202bc9e83a1 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.872s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.975 186792 DEBUG nova.compute.manager [req-10208470-de38-493a-93e4-77fa61c9f405 req-c221ced4-7db9-4a8f-b701-e4c013edd5da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.976 186792 DEBUG oslo_concurrency.lockutils [req-10208470-de38-493a-93e4-77fa61c9f405 req-c221ced4-7db9-4a8f-b701-e4c013edd5da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.976 186792 DEBUG oslo_concurrency.lockutils [req-10208470-de38-493a-93e4-77fa61c9f405 req-c221ced4-7db9-4a8f-b701-e4c013edd5da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.976 186792 DEBUG oslo_concurrency.lockutils [req-10208470-de38-493a-93e4-77fa61c9f405 req-c221ced4-7db9-4a8f-b701-e4c013edd5da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.977 186792 DEBUG nova.compute.manager [req-10208470-de38-493a-93e4-77fa61c9f405 req-c221ced4-7db9-4a8f-b701-e4c013edd5da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] No waiting events found dispatching network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:38:11 np0005531888 nova_compute[186788]: 2025-11-22 08:38:11.977 186792 WARNING nova.compute.manager [req-10208470-de38-493a-93e4-77fa61c9f405 req-c221ced4-7db9-4a8f-b701-e4c013edd5da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received unexpected event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d for instance with vm_state active and task_state None.#033[00m
Nov 22 03:38:12 np0005531888 podman[247166]: 2025-11-22 08:38:12.707070025 +0000 UTC m=+0.074399588 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:38:12 np0005531888 podman[247167]: 2025-11-22 08:38:12.721683458 +0000 UTC m=+0.074441890 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:38:13 np0005531888 nova_compute[186788]: 2025-11-22 08:38:13.693 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:14 np0005531888 nova_compute[186788]: 2025-11-22 08:38:14.397 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:16 np0005531888 podman[247208]: 2025-11-22 08:38:16.680249964 +0000 UTC m=+0.057649591 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:38:16 np0005531888 podman[247210]: 2025-11-22 08:38:16.707676962 +0000 UTC m=+0.075630519 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:38:16 np0005531888 podman[247209]: 2025-11-22 08:38:16.707981631 +0000 UTC m=+0.081747726 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:38:17 np0005531888 nova_compute[186788]: 2025-11-22 08:38:17.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:18 np0005531888 nova_compute[186788]: 2025-11-22 08:38:18.695 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:18 np0005531888 nova_compute[186788]: 2025-11-22 08:38:18.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:18 np0005531888 nova_compute[186788]: 2025-11-22 08:38:18.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:38:18 np0005531888 nova_compute[186788]: 2025-11-22 08:38:18.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:38:19 np0005531888 nova_compute[186788]: 2025-11-22 08:38:19.267 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:38:19 np0005531888 nova_compute[186788]: 2025-11-22 08:38:19.268 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:38:19 np0005531888 nova_compute[186788]: 2025-11-22 08:38:19.268 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:38:19 np0005531888 nova_compute[186788]: 2025-11-22 08:38:19.269 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e304bef1-c7a3-45f0-9975-d9dc37b7fba4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:38:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:19.333 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:38:19 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:19.334 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:38:19 np0005531888 nova_compute[186788]: 2025-11-22 08:38:19.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:19 np0005531888 nova_compute[186788]: 2025-11-22 08:38:19.399 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:22 np0005531888 nova_compute[186788]: 2025-11-22 08:38:22.254 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:22Z|00690|binding|INFO|Releasing lport c85ad4c8-d148-4415-be4e-a8012749d2ac from this chassis (sb_readonly=0)
Nov 22 03:38:22 np0005531888 NetworkManager[55166]: <info>  [1763800702.2553] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Nov 22 03:38:22 np0005531888 NetworkManager[55166]: <info>  [1763800702.2563] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Nov 22 03:38:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:22Z|00691|binding|INFO|Releasing lport c85ad4c8-d148-4415-be4e-a8012749d2ac from this chassis (sb_readonly=0)
Nov 22 03:38:22 np0005531888 nova_compute[186788]: 2025-11-22 08:38:22.283 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:22 np0005531888 nova_compute[186788]: 2025-11-22 08:38:22.288 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.331 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.354 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.354 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.698 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.844 186792 DEBUG nova.compute.manager [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-changed-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.845 186792 DEBUG nova.compute.manager [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Refreshing instance network info cache due to event network-changed-def7aa64-0c3a-4e1b-9de9-b44e62099d6d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.845 186792 DEBUG oslo_concurrency.lockutils [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.845 186792 DEBUG oslo_concurrency.lockutils [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:38:23 np0005531888 nova_compute[186788]: 2025-11-22 08:38:23.846 186792 DEBUG nova.network.neutron [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Refreshing network info cache for port def7aa64-0c3a-4e1b-9de9-b44e62099d6d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:38:24 np0005531888 nova_compute[186788]: 2025-11-22 08:38:24.348 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:24 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:24Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:c0:5c 10.100.0.13
Nov 22 03:38:24 np0005531888 ovn_controller[95067]: 2025-11-22T08:38:24Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:c0:5c 10.100.0.13
Nov 22 03:38:24 np0005531888 nova_compute[186788]: 2025-11-22 08:38:24.400 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:25 np0005531888 nova_compute[186788]: 2025-11-22 08:38:25.359 186792 DEBUG nova.network.neutron [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updated VIF entry in instance network info cache for port def7aa64-0c3a-4e1b-9de9-b44e62099d6d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:38:25 np0005531888 nova_compute[186788]: 2025-11-22 08:38:25.359 186792 DEBUG nova.network.neutron [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:38:25 np0005531888 nova_compute[186788]: 2025-11-22 08:38:25.372 186792 DEBUG oslo_concurrency.lockutils [req-0930f607-ebc9-43aa-9323-d8875e215280 req-a5d6d15c-4ead-4786-9124-8b32cbc1c3ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:38:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:26.337 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:27 np0005531888 nova_compute[186788]: 2025-11-22 08:38:27.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:28 np0005531888 nova_compute[186788]: 2025-11-22 08:38:28.701 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:29 np0005531888 nova_compute[186788]: 2025-11-22 08:38:29.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:29 np0005531888 nova_compute[186788]: 2025-11-22 08:38:29.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:30 np0005531888 nova_compute[186788]: 2025-11-22 08:38:30.087 186792 INFO nova.compute.manager [None req-d54de764-1ad2-4e4e-b8e8-c66be95c769c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Get console output#033[00m
Nov 22 03:38:30 np0005531888 nova_compute[186788]: 2025-11-22 08:38:30.092 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:38:31 np0005531888 podman[247288]: 2025-11-22 08:38:31.68032344 +0000 UTC m=+0.054676075 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:38:31 np0005531888 podman[247289]: 2025-11-22 08:38:31.699409017 +0000 UTC m=+0.072466669 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.613 186792 DEBUG nova.compute.manager [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-changed-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.613 186792 DEBUG nova.compute.manager [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Refreshing instance network info cache due to event network-changed-def7aa64-0c3a-4e1b-9de9-b44e62099d6d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.614 186792 DEBUG oslo_concurrency.lockutils [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.614 186792 DEBUG oslo_concurrency.lockutils [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.614 186792 DEBUG nova.network.neutron [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Refreshing network info cache for port def7aa64-0c3a-4e1b-9de9-b44e62099d6d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.703 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:33 np0005531888 nova_compute[186788]: 2025-11-22 08:38:33.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:34 np0005531888 nova_compute[186788]: 2025-11-22 08:38:34.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:34 np0005531888 nova_compute[186788]: 2025-11-22 08:38:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:34 np0005531888 nova_compute[186788]: 2025-11-22 08:38:34.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:34 np0005531888 nova_compute[186788]: 2025-11-22 08:38:34.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:34 np0005531888 nova_compute[186788]: 2025-11-22 08:38:34.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:34 np0005531888 nova_compute[186788]: 2025-11-22 08:38:34.974 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.042 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.100 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.101 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.175 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.318 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.319 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5534MB free_disk=73.23853302001953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.319 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.320 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.445 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance e304bef1-c7a3-45f0-9975-d9dc37b7fba4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.445 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.446 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.499 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.512 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.755 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.756 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.978 186792 DEBUG nova.network.neutron [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updated VIF entry in instance network info cache for port def7aa64-0c3a-4e1b-9de9-b44e62099d6d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.979 186792 DEBUG nova.network.neutron [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:38:35 np0005531888 nova_compute[186788]: 2025-11-22 08:38:35.995 186792 DEBUG oslo_concurrency.lockutils [req-d8af6449-7423-4fcc-9872-62ffcb1348f1 req-9e644c54-f12e-4ebd-bf6a-afa062bb4804 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:38:36 np0005531888 nova_compute[186788]: 2025-11-22 08:38:36.756 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.855 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'name': 'tempest-TestNetworkBasicOps-server-883461096', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a9', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '12f63a6d87a947758ab928c0d625ff06', 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'hostId': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:38:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:36.857 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:36.857 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:38:36.858 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.870 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.870 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73bb2cf4-1d11-4071-9cca-ffa98c8b3912', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.856122', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a3575fcc-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.555607523, 'message_signature': '9483ffcadc7c7a64d5d119a1cd27a7f2af38b787b76a5ff33544fde2f69d0b86'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.856122', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a3576dbe-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.555607523, 'message_signature': '5579967bfc9a6d50b2db9c41d8247eb594b8c75744ff7ab90efcb74bc95c9e3d'}]}, 'timestamp': '2025-11-22 08:38:36.870950', '_unique_id': 'f8b67a0c87bf404e899ee3711ffb2a77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.908 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.909 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e1f7c10-2857-4f61-802d-577b5f1d96e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.873295', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a35d5922-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': 'e617d672ea2695c6b7b36c688060db21f89fb184ae71dad48490198072db562e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.873295', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a35d675a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': '109eb28e8d26d283b41cb684f3645a66ed0a93d56fae960b7de1c32f6bbb570a'}]}, 'timestamp': '2025-11-22 08:38:36.910078', '_unique_id': 'a62795eabd5c4d69a8f74cd38b54f9d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.910 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.914 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e304bef1-c7a3-45f0-9975-d9dc37b7fba4 / tapdef7aa64-0c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.915 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.outgoing.packets volume: 106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23db3556-01b3-4070-9f0f-7bfc0082686e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 106, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.912518', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a35e3e5a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': '631ba2776175b7079f100e7d32788551d6120ca5382f42935c429af03b73d1fb'}]}, 'timestamp': '2025-11-22 08:38:36.915685', '_unique_id': '49b1dfabf4ab4bebbf948f5df2ebba20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.917 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c961b569-ab11-4054-b512-dff8db552fe5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.917898', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a35ea5f2-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': '93712ef07ce6ad6d628e47260c920586f77ff0ff281e55db3a5abb4198285bc9'}]}, 'timestamp': '2025-11-22 08:38:36.918280', '_unique_id': 'b8c54a450a3748588e4f5014f53e959e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.918 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.936 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/memory.usage volume: 42.50390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6eb286c-3f39-4d9a-a049-c583b9c79f04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.50390625, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'timestamp': '2025-11-22T08:38:36.920342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a3619424-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.636272069, 'message_signature': '3e0ec5b34b1390058a6219f6ab3872284300578432c32ef5e082b8502b1cd192'}]}, 'timestamp': '2025-11-22 08:38:36.937500', '_unique_id': 'dc059aeb57be45f2b17e8b3654fc9729'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.938 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.939 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.incoming.bytes volume: 18962 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6eaa2e34-c9d2-449f-a5ff-a5aac5b2e587', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18962, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.939527', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a361f2f2-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': 'e42126eeeb2e4d2d5d606fab700c005ffb8802853a7f3e1ade560d5f97a28f1c'}]}, 'timestamp': '2025-11-22 08:38:36.939908', '_unique_id': '633fa6281c5e4491ab305d127bcd18d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.941 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.941 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>]
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.942 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.942 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>]
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.942 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.942 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69ca7f62-796d-4764-893d-8928ec2d0664', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.942551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a362693a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': 'd96f618ce6179f09bb5309f774eac4b15190d34dbec088054b84ea45c53db05f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 
'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.942551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a3627600-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': 'a5a57433a1dca58c15adecd3383810a8987f7700c9853432fcd5085f1e4dfc66'}]}, 'timestamp': '2025-11-22 08:38:36.943233', '_unique_id': '43241381e2d6453ab001e9df8a264ee5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.945 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '161cc459-ccab-4f1b-be7a-a80f73cbb4bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.945246', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a362d186-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': 'be3c472c44056cec4d7446c340c126cc89f2d7430900aaf0da90b79fea7669da'}]}, 'timestamp': '2025-11-22 08:38:36.945620', '_unique_id': 'e5b11a0f883c467c9e334c1164f7ae38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.947 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.947 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>]
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5cbc92a-59f3-4f2e-945d-f25bdafe7d8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.947983', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a3633bee-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': '99db5dc7759e452f54c7afa85f74177a6d00a767e294a88d2c23555816d9bbb9'}]}, 'timestamp': '2025-11-22 08:38:36.948324', '_unique_id': 'f0c93316fd9247d0ac4c71bef5d74e69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.950 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.950 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09556d1c-7341-4082-a2dc-94c7d23271cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.950213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a3639346-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.555607523, 'message_signature': '0320c32470c3310a5563db767456d86671898e090596879fddfcb01990116c86'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.950213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a363a020-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.555607523, 'message_signature': '3db40c184ad8173162ba1a573fb3710fc397b2f8404b81def1540929566cc52f'}]}, 'timestamp': '2025-11-22 08:38:36.950872', '_unique_id': 'cf740d620ad84d31a5b490fb2e7088ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.952 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.read.latency volume: 766959957 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.read.latency volume: 213637917 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de357b50-8dc1-4b16-af22-b2e7a8b2b468', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 766959957, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.952783', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a363f7aa-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': '81fae10bb8bd37785815ac101091d8bced86410b37688310f0b8b727ad8c048d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 213637917, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.952783', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a3640376-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': '2e983a3a6589ffea459ddb063e229a93b3414410d413992317976518affabb56'}]}, 'timestamp': '2025-11-22 08:38:36.953414', '_unique_id': '7cc485df11f34ff1ae4a409c7745ee51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.955 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.955 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-883461096>]
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.955 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5c9b09c-320c-4abf-a9a6-225908e97fe0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.955779', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a3646d52-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': '2533c6b74b7f3f1d78578ddd279f56cdaf5aadc1fc56933f56e3b7e4dd0b0cad'}]}, 'timestamp': '2025-11-22 08:38:36.956140', '_unique_id': 'bbd0cda811084dba8afa5c2222dca32c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.957 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.958 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91722c6f-4060-4247-978b-be0c9c93ad95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.957942', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a364c162-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.555607523, 'message_signature': 'e052213022a0fe38061324fffcfeee9106b60654166df324893245a9ba7ccf7a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.957942', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a364ccd4-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.555607523, 'message_signature': 'a7dba3c9602f43def8166fe18abf667ec39021ef840c8368788a113d9333705a'}]}, 'timestamp': '2025-11-22 08:38:36.958531', '_unique_id': '374f301b80a4403eb6dc01f7712b977a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9935c1df-e1be-4330-8110-c9644160e625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.960137', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a3651586-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': 'f661e018fa849ee39c0b9abf13977f23daf2f90b80510dbd51a973ce22ec364b'}]}, 'timestamp': '2025-11-22 08:38:36.960403', '_unique_id': '68306ba715e647fe81b034a038d46422'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.961 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.962 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.962 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f92775a-9459-4d4c-a785-dfc8d2216ec8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.962034', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a3655f82-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': 'cb744ef97d03d9d5e430ef89d3e8653c07a0068508f245deb1ce3a8fffd7f5d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.962034', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a36568ba-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': '491bfb2a437d7aca25f178657cf9b91fa2af5d38fbb554d15b4aff06d9aceb07'}]}, 'timestamp': '2025-11-22 08:38:36.962519', '_unique_id': '3d4da47918b143bbb7dad56f76dc923c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.963 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/cpu volume: 12390000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd87bf702-96be-47d0-90ba-c55f0ca3bbd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12390000000, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'timestamp': '2025-11-22T08:38:36.964035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a365adb6-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.636272069, 'message_signature': '27ac4cebdf06cfde1197f6d95c5662468dcab618bfa7bcc7fc6da2ddbaf74629'}]}, 'timestamp': '2025-11-22 08:38:36.964295', '_unique_id': 'e422fa3611804ad484a8ebc854fa4eb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.965 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.write.latency volume: 30453478880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a7d1325-f55d-4bf2-9825-3824d747a496', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30453478880, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.965724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a365efec-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': '3703d0491b2f4e41bfa686e4dff7c636806dd99e2062c4ac9936f07143f906ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.965724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a365fb0e-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': 'a73cdf112a2c8fb989c500fd21949addb21fdfec25b2c9aef9c18fb6b3adfc7b'}]}, 'timestamp': '2025-11-22 08:38:36.966299', '_unique_id': 'd9eae93cbe904fffa73925f4e30fd4db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.968 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.incoming.packets volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcc01cdc-e54c-423c-893f-0d3c2e6e94c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 103, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.968380', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a3665824-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': '2b8e048ba6571a84f14e51ea828d6cd05aa7a16f2e8d42334064e07c2b9fbc4e'}]}, 'timestamp': '2025-11-22 08:38:36.968713', '_unique_id': '0c8c7f7960b8447c9139f94a149d0792'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.970 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7da68f3-1d67-4d04-8202-a818c3c1974a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.970392', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a366a716-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': 'dc35c32dc2dcedec9966de8cf281bbe3a398ac0bf15f5eba4ce4d0d6e769d6fc'}]}, 'timestamp': '2025-11-22 08:38:36.970719', '_unique_id': '6f20088c5de44a728b36faa68af99f14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.972 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.972 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74aa6349-66f3-43db-82fd-a5858f062494', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-vda', 'timestamp': '2025-11-22T08:38:36.972617', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a366ff7c-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': 'a5f75baa1560c740d9b1779c7964bd19302381a458801a265b84e3b104c15b00'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4-sda', 'timestamp': '2025-11-22T08:38:36.972617', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'instance-000000a9', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a367229a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.572810902, 'message_signature': '1b19d8e0892213f76e0b391040382f996b87134cf66a24a3ad6dd68538859517'}]}, 'timestamp': '2025-11-22 08:38:36.973855', '_unique_id': 'a622ba66f8a64fe98da551cc56ea90fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.975 12 DEBUG ceilometer.compute.pollsters [-] e304bef1-c7a3-45f0-9975-d9dc37b7fba4/network.outgoing.bytes volume: 15832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45d844d1-fa76-4bd2-80ef-d15fb1cb713d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15832, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a9-e304bef1-c7a3-45f0-9975-d9dc37b7fba4-tapdef7aa64-0c', 'timestamp': '2025-11-22T08:38:36.975549', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-883461096', 'name': 'tapdef7aa64-0c', 'instance_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c0:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdef7aa64-0c'}, 'message_id': 'a3677132-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7457.612060702, 'message_signature': 'fca49db10ac774a060b35b3aaf167d1fc49e24bd43b566e0c682698cd462fcb8'}]}, 'timestamp': '2025-11-22 08:38:36.975889', '_unique_id': 'fa4acbe861ac4268af7ed30a4ab83287'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:38:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:38:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:38:38 np0005531888 nova_compute[186788]: 2025-11-22 08:38:38.705 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:38 np0005531888 nova_compute[186788]: 2025-11-22 08:38:38.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:38 np0005531888 nova_compute[186788]: 2025-11-22 08:38:38.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:38:39 np0005531888 nova_compute[186788]: 2025-11-22 08:38:39.407 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:43 np0005531888 nova_compute[186788]: 2025-11-22 08:38:43.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:43 np0005531888 podman[247337]: 2025-11-22 08:38:43.717782173 +0000 UTC m=+0.076798999 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:38:43 np0005531888 podman[247336]: 2025-11-22 08:38:43.729218265 +0000 UTC m=+0.093063084 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 22 03:38:44 np0005531888 nova_compute[186788]: 2025-11-22 08:38:44.409 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:47 np0005531888 podman[247378]: 2025-11-22 08:38:47.677474607 +0000 UTC m=+0.054933291 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Nov 22 03:38:47 np0005531888 podman[247379]: 2025-11-22 08:38:47.684153987 +0000 UTC m=+0.054792957 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:38:47 np0005531888 podman[247380]: 2025-11-22 08:38:47.734350467 +0000 UTC m=+0.101511888 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 22 03:38:48 np0005531888 nova_compute[186788]: 2025-11-22 08:38:48.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:49 np0005531888 nova_compute[186788]: 2025-11-22 08:38:49.409 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:53 np0005531888 nova_compute[186788]: 2025-11-22 08:38:53.711 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:54 np0005531888 nova_compute[186788]: 2025-11-22 08:38:54.410 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:58 np0005531888 nova_compute[186788]: 2025-11-22 08:38:58.713 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:58 np0005531888 nova_compute[186788]: 2025-11-22 08:38:58.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:59 np0005531888 nova_compute[186788]: 2025-11-22 08:38:59.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:02 np0005531888 podman[247442]: 2025-11-22 08:39:02.678731852 +0000 UTC m=+0.056233205 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:39:02 np0005531888 podman[247443]: 2025-11-22 08:39:02.679052391 +0000 UTC m=+0.052063868 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:39:03 np0005531888 nova_compute[186788]: 2025-11-22 08:39:03.716 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:04 np0005531888 nova_compute[186788]: 2025-11-22 08:39:04.415 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:08 np0005531888 nova_compute[186788]: 2025-11-22 08:39:08.718 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:09 np0005531888 nova_compute[186788]: 2025-11-22 08:39:09.418 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:13 np0005531888 nova_compute[186788]: 2025-11-22 08:39:13.721 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:14.272 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:39:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:14.273 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:39:14 np0005531888 nova_compute[186788]: 2025-11-22 08:39:14.275 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:14 np0005531888 nova_compute[186788]: 2025-11-22 08:39:14.423 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:14 np0005531888 podman[247481]: 2025-11-22 08:39:14.676915703 +0000 UTC m=+0.049757310 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:39:14 np0005531888 podman[247482]: 2025-11-22 08:39:14.682352572 +0000 UTC m=+0.049665308 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:39:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:16.277 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:39:17 np0005531888 nova_compute[186788]: 2025-11-22 08:39:17.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:18 np0005531888 podman[247525]: 2025-11-22 08:39:18.678463546 +0000 UTC m=+0.055128347 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Nov 22 03:39:18 np0005531888 podman[247526]: 2025-11-22 08:39:18.685927156 +0000 UTC m=+0.059145229 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 22 03:39:18 np0005531888 podman[247527]: 2025-11-22 08:39:18.713392266 +0000 UTC m=+0.083373487 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 03:39:18 np0005531888 nova_compute[186788]: 2025-11-22 08:39:18.726 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:19 np0005531888 nova_compute[186788]: 2025-11-22 08:39:19.426 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:20 np0005531888 nova_compute[186788]: 2025-11-22 08:39:20.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:20 np0005531888 nova_compute[186788]: 2025-11-22 08:39:20.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:39:20 np0005531888 nova_compute[186788]: 2025-11-22 08:39:20.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.456 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.457 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.457 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.458 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.458 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.470 186792 INFO nova.compute.manager [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Terminating instance#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.479 186792 DEBUG nova.compute.manager [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:39:21 np0005531888 kernel: tapdef7aa64-0c (unregistering): left promiscuous mode
Nov 22 03:39:21 np0005531888 NetworkManager[55166]: <info>  [1763800761.5102] device (tapdef7aa64-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:39:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:39:21Z|00692|binding|INFO|Releasing lport def7aa64-0c3a-4e1b-9de9-b44e62099d6d from this chassis (sb_readonly=0)
Nov 22 03:39:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:39:21Z|00693|binding|INFO|Setting lport def7aa64-0c3a-4e1b-9de9-b44e62099d6d down in Southbound
Nov 22 03:39:21 np0005531888 ovn_controller[95067]: 2025-11-22T08:39:21Z|00694|binding|INFO|Removing iface tapdef7aa64-0c ovn-installed in OVS
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.521 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.535 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:21 np0005531888 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Nov 22 03:39:21 np0005531888 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a9.scope: Consumed 16.811s CPU time.
Nov 22 03:39:21 np0005531888 systemd-machined[153106]: Machine qemu-82-instance-000000a9 terminated.
Nov 22 03:39:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:21.673 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c0:5c 10.100.0.13'], port_security=['fa:16:3e:1b:c0:5c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e304bef1-c7a3-45f0-9975-d9dc37b7fba4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a869b57-a476-415c-ba30-ce607e13fca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31665b59-3e3b-4029-aea2-90d19deb1554', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f98a1991-00c9-465f-9246-5677a61cda3d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=def7aa64-0c3a-4e1b-9de9-b44e62099d6d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:39:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:21.674 104023 INFO neutron.agent.ovn.metadata.agent [-] Port def7aa64-0c3a-4e1b-9de9-b44e62099d6d in datapath 8a869b57-a476-415c-ba30-ce607e13fca8 unbound from our chassis#033[00m
Nov 22 03:39:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:21.675 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a869b57-a476-415c-ba30-ce607e13fca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:39:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:21.677 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5017c7-f1ad-4049-9a05-29dfb79d45d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:21 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:21.678 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 namespace which is not needed anymore#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.742 186792 INFO nova.virt.libvirt.driver [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Instance destroyed successfully.#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.743 186792 DEBUG nova.objects.instance [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid e304bef1-c7a3-45f0-9975-d9dc37b7fba4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.760 186792 DEBUG nova.virt.libvirt.vif [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:37:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-883461096',display_name='tempest-TestNetworkBasicOps-server-883461096',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-883461096',id=169,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD0z/d3D8Z1P4K6JIpGV7Tq4WTuVk8ez/ejc+HnxCd8xIpRVLW+pHpsYU0NMH0CQcfwPQU8cK3poygMokn8lRRcNDaAElwceyW121vP8CjaKUGd0RpUS661cpbp+qYCy8g==',key_name='tempest-TestNetworkBasicOps-1997631990',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:38:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-l7ketysl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:38:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=e304bef1-c7a3-45f0-9975-d9dc37b7fba4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.761 186792 DEBUG nova.network.os_vif_util [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.761 186792 DEBUG nova.network.os_vif_util [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.762 186792 DEBUG os_vif [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.764 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.764 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdef7aa64-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.767 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.768 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.771 186792 INFO os_vif [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:c0:5c,bridge_name='br-int',has_traffic_filtering=True,id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef7aa64-0c')#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.772 186792 INFO nova.virt.libvirt.driver [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Deleting instance files /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4_del#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.772 186792 INFO nova.virt.libvirt.driver [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Deletion of /var/lib/nova/instances/e304bef1-c7a3-45f0-9975-d9dc37b7fba4_del complete#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.777 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.777 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.778 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:39:21 np0005531888 nova_compute[186788]: 2025-11-22 08:39:21.778 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e304bef1-c7a3-45f0-9975-d9dc37b7fba4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:39:21 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [NOTICE]   (247148) : haproxy version is 2.8.14-c23fe91
Nov 22 03:39:21 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [NOTICE]   (247148) : path to executable is /usr/sbin/haproxy
Nov 22 03:39:21 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [WARNING]  (247148) : Exiting Master process...
Nov 22 03:39:21 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [ALERT]    (247148) : Current worker (247150) exited with code 143 (Terminated)
Nov 22 03:39:21 np0005531888 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[247144]: [WARNING]  (247148) : All workers exited. Exiting... (0)
Nov 22 03:39:21 np0005531888 systemd[1]: libpod-0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854.scope: Deactivated successfully.
Nov 22 03:39:21 np0005531888 podman[247635]: 2025-11-22 08:39:21.826827985 +0000 UTC m=+0.053404143 container died 0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:39:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay-87f8e09d656a456b9fbf6b764278dd820654e508178aa1a386f465065a4b7ca6-merged.mount: Deactivated successfully.
Nov 22 03:39:21 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854-userdata-shm.mount: Deactivated successfully.
Nov 22 03:39:21 np0005531888 podman[247635]: 2025-11-22 08:39:21.9407449 +0000 UTC m=+0.167321068 container cleanup 0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:39:21 np0005531888 systemd[1]: libpod-conmon-0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854.scope: Deactivated successfully.
Nov 22 03:39:22 np0005531888 nova_compute[186788]: 2025-11-22 08:39:22.010 186792 INFO nova.compute.manager [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:39:22 np0005531888 nova_compute[186788]: 2025-11-22 08:39:22.011 186792 DEBUG oslo.service.loopingcall [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:39:22 np0005531888 nova_compute[186788]: 2025-11-22 08:39:22.012 186792 DEBUG nova.compute.manager [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:39:22 np0005531888 nova_compute[186788]: 2025-11-22 08:39:22.012 186792 DEBUG nova.network.neutron [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:39:22 np0005531888 podman[247667]: 2025-11-22 08:39:22.036511021 +0000 UTC m=+0.076407448 container remove 0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.042 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1c34d5cf-bdeb-4b73-88d9-14976acf6da9]: (4, ('Sat Nov 22 08:39:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 (0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854)\n0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854\nSat Nov 22 08:39:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 (0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854)\n0c299f0a067422f6bfe66bd215763f0885ed5648eeec875b05b2b5aaba3fa854\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.043 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4dadc460-1ac8-4ef3-ab0c-1ab6db5c71b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.044 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a869b57-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:39:22 np0005531888 nova_compute[186788]: 2025-11-22 08:39:22.045 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:22 np0005531888 kernel: tap8a869b57-a0: left promiscuous mode
Nov 22 03:39:22 np0005531888 nova_compute[186788]: 2025-11-22 08:39:22.056 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.059 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[658c10ff-4f63-4cdf-b3c7-5bc2879ef955]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.089 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ecaa621c-4d56-4669-a363-3fc8c07d7681]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.090 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9f4f9b-3dfb-4531-a8f4-ba37e46264eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.109 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[42d72663-53f4-49bc-8c7d-654bb92dc710]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743029, 'reachable_time': 15942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247682, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:22 np0005531888 systemd[1]: run-netns-ovnmeta\x2d8a869b57\x2da476\x2d415c\x2dba30\x2dce607e13fca8.mount: Deactivated successfully.
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.113 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:39:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:22.113 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[75cf379f-ed70-447a-b6a2-5f6d4b9fe9bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:39:23 np0005531888 nova_compute[186788]: 2025-11-22 08:39:23.553 186792 DEBUG nova.compute.manager [req-848cd49d-00ba-436e-b76b-af312d49a53f req-e9c4cdc6-1a21-4cd9-b2cd-9a500d9dca21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-vif-unplugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:39:23 np0005531888 nova_compute[186788]: 2025-11-22 08:39:23.554 186792 DEBUG oslo_concurrency.lockutils [req-848cd49d-00ba-436e-b76b-af312d49a53f req-e9c4cdc6-1a21-4cd9-b2cd-9a500d9dca21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:23 np0005531888 nova_compute[186788]: 2025-11-22 08:39:23.555 186792 DEBUG oslo_concurrency.lockutils [req-848cd49d-00ba-436e-b76b-af312d49a53f req-e9c4cdc6-1a21-4cd9-b2cd-9a500d9dca21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:23 np0005531888 nova_compute[186788]: 2025-11-22 08:39:23.555 186792 DEBUG oslo_concurrency.lockutils [req-848cd49d-00ba-436e-b76b-af312d49a53f req-e9c4cdc6-1a21-4cd9-b2cd-9a500d9dca21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:23 np0005531888 nova_compute[186788]: 2025-11-22 08:39:23.555 186792 DEBUG nova.compute.manager [req-848cd49d-00ba-436e-b76b-af312d49a53f req-e9c4cdc6-1a21-4cd9-b2cd-9a500d9dca21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] No waiting events found dispatching network-vif-unplugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:39:23 np0005531888 nova_compute[186788]: 2025-11-22 08:39:23.555 186792 DEBUG nova.compute.manager [req-848cd49d-00ba-436e-b76b-af312d49a53f req-e9c4cdc6-1a21-4cd9-b2cd-9a500d9dca21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-vif-unplugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:39:24 np0005531888 nova_compute[186788]: 2025-11-22 08:39:24.427 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.428 186792 DEBUG nova.network.neutron [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.482 186792 DEBUG nova.compute.manager [req-16cd1062-bc57-4ad4-97fa-ab299d5b9ea7 req-285cfba4-05ff-40a5-bc50-e94ff459f87f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-vif-deleted-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.483 186792 INFO nova.compute.manager [req-16cd1062-bc57-4ad4-97fa-ab299d5b9ea7 req-285cfba4-05ff-40a5-bc50-e94ff459f87f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Neutron deleted interface def7aa64-0c3a-4e1b-9de9-b44e62099d6d; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.483 186792 DEBUG nova.network.neutron [req-16cd1062-bc57-4ad4-97fa-ab299d5b9ea7 req-285cfba4-05ff-40a5-bc50-e94ff459f87f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.499 186792 INFO nova.compute.manager [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Took 3.49 seconds to deallocate network for instance.#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.515 186792 DEBUG nova.compute.manager [req-16cd1062-bc57-4ad4-97fa-ab299d5b9ea7 req-285cfba4-05ff-40a5-bc50-e94ff459f87f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Detach interface failed, port_id=def7aa64-0c3a-4e1b-9de9-b44e62099d6d, reason: Instance e304bef1-c7a3-45f0-9975-d9dc37b7fba4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.576 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.577 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.668 186792 DEBUG nova.compute.manager [req-d464e7a5-4ae3-4817-a204-80fa7bfcb3f6 req-7869b774-b5b9-473e-885e-1a2c83dc6792 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.668 186792 DEBUG oslo_concurrency.lockutils [req-d464e7a5-4ae3-4817-a204-80fa7bfcb3f6 req-7869b774-b5b9-473e-885e-1a2c83dc6792 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.668 186792 DEBUG oslo_concurrency.lockutils [req-d464e7a5-4ae3-4817-a204-80fa7bfcb3f6 req-7869b774-b5b9-473e-885e-1a2c83dc6792 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.669 186792 DEBUG oslo_concurrency.lockutils [req-d464e7a5-4ae3-4817-a204-80fa7bfcb3f6 req-7869b774-b5b9-473e-885e-1a2c83dc6792 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.669 186792 DEBUG nova.compute.manager [req-d464e7a5-4ae3-4817-a204-80fa7bfcb3f6 req-7869b774-b5b9-473e-885e-1a2c83dc6792 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] No waiting events found dispatching network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.669 186792 WARNING nova.compute.manager [req-d464e7a5-4ae3-4817-a204-80fa7bfcb3f6 req-7869b774-b5b9-473e-885e-1a2c83dc6792 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Received unexpected event network-vif-plugged-def7aa64-0c3a-4e1b-9de9-b44e62099d6d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.684 186792 DEBUG nova.compute.provider_tree [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.698 186792 DEBUG nova.scheduler.client.report [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.725 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.886 186792 INFO nova.scheduler.client.report [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance e304bef1-c7a3-45f0-9975-d9dc37b7fba4#033[00m
Nov 22 03:39:25 np0005531888 nova_compute[186788]: 2025-11-22 08:39:25.911 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updating instance_info_cache with network_info: [{"id": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "address": "fa:16:3e:1b:c0:5c", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef7aa64-0c", "ovs_interfaceid": "def7aa64-0c3a-4e1b-9de9-b44e62099d6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:39:26 np0005531888 nova_compute[186788]: 2025-11-22 08:39:26.097 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-e304bef1-c7a3-45f0-9975-d9dc37b7fba4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:39:26 np0005531888 nova_compute[186788]: 2025-11-22 08:39:26.098 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:39:26 np0005531888 nova_compute[186788]: 2025-11-22 08:39:26.255 186792 DEBUG oslo_concurrency.lockutils [None req-3beec2f9-0bb4-4ecb-8c5a-3d7b5a24b37b 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "e304bef1-c7a3-45f0-9975-d9dc37b7fba4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:26 np0005531888 nova_compute[186788]: 2025-11-22 08:39:26.768 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:27 np0005531888 nova_compute[186788]: 2025-11-22 08:39:27.092 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:27 np0005531888 nova_compute[186788]: 2025-11-22 08:39:27.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:29 np0005531888 nova_compute[186788]: 2025-11-22 08:39:29.428 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:29 np0005531888 nova_compute[186788]: 2025-11-22 08:39:29.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:30 np0005531888 nova_compute[186788]: 2025-11-22 08:39:30.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:30 np0005531888 nova_compute[186788]: 2025-11-22 08:39:30.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:39:31 np0005531888 nova_compute[186788]: 2025-11-22 08:39:31.770 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:33 np0005531888 podman[247683]: 2025-11-22 08:39:33.675333775 +0000 UTC m=+0.048833424 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:39:33 np0005531888 podman[247684]: 2025-11-22 08:39:33.675379176 +0000 UTC m=+0.047149573 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:39:33 np0005531888 nova_compute[186788]: 2025-11-22 08:39:33.975 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:34 np0005531888 nova_compute[186788]: 2025-11-22 08:39:34.430 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:34 np0005531888 nova_compute[186788]: 2025-11-22 08:39:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:34 np0005531888 nova_compute[186788]: 2025-11-22 08:39:34.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:34 np0005531888 nova_compute[186788]: 2025-11-22 08:39:34.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:34 np0005531888 nova_compute[186788]: 2025-11-22 08:39:34.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:34 np0005531888 nova_compute[186788]: 2025-11-22 08:39:34.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.155 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.156 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5709MB free_disk=73.26722717285156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.156 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.156 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.233 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.233 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.260 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:39:35 np0005531888 nova_compute[186788]: 2025-11-22 08:39:35.276 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:39:36 np0005531888 nova_compute[186788]: 2025-11-22 08:39:36.475 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:39:36 np0005531888 nova_compute[186788]: 2025-11-22 08:39:36.476 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:36 np0005531888 nova_compute[186788]: 2025-11-22 08:39:36.740 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800761.739796, e304bef1-c7a3-45f0-9975-d9dc37b7fba4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:39:36 np0005531888 nova_compute[186788]: 2025-11-22 08:39:36.741 186792 INFO nova.compute.manager [-] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:39:36 np0005531888 nova_compute[186788]: 2025-11-22 08:39:36.769 186792 DEBUG nova.compute.manager [None req-3e06d5dc-8096-4b97-b160-fd87934f208c - - - - - -] [instance: e304bef1-c7a3-45f0-9975-d9dc37b7fba4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:39:36 np0005531888 nova_compute[186788]: 2025-11-22 08:39:36.774 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:36.859 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:36.859 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:36.859 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:38 np0005531888 nova_compute[186788]: 2025-11-22 08:39:38.477 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:38 np0005531888 nova_compute[186788]: 2025-11-22 08:39:38.834 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:38 np0005531888 nova_compute[186788]: 2025-11-22 08:39:38.905 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:38 np0005531888 nova_compute[186788]: 2025-11-22 08:39:38.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:38 np0005531888 nova_compute[186788]: 2025-11-22 08:39:38.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:39:39 np0005531888 nova_compute[186788]: 2025-11-22 08:39:39.431 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:41 np0005531888 nova_compute[186788]: 2025-11-22 08:39:41.776 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:42 np0005531888 nova_compute[186788]: 2025-11-22 08:39:42.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:44 np0005531888 nova_compute[186788]: 2025-11-22 08:39:44.433 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:45 np0005531888 podman[247728]: 2025-11-22 08:39:45.69277889 +0000 UTC m=+0.055129760 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:39:45 np0005531888 podman[247727]: 2025-11-22 08:39:45.692938744 +0000 UTC m=+0.061837135 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:39:46 np0005531888 nova_compute[186788]: 2025-11-22 08:39:46.779 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:49 np0005531888 nova_compute[186788]: 2025-11-22 08:39:49.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:49 np0005531888 podman[247771]: 2025-11-22 08:39:49.691167195 +0000 UTC m=+0.060113052 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Nov 22 03:39:49 np0005531888 podman[247772]: 2025-11-22 08:39:49.728564986 +0000 UTC m=+0.092567931 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:39:49 np0005531888 podman[247773]: 2025-11-22 08:39:49.756545125 +0000 UTC m=+0.113708001 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 22 03:39:51 np0005531888 nova_compute[186788]: 2025-11-22 08:39:51.782 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:52 np0005531888 nova_compute[186788]: 2025-11-22 08:39:52.976 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:52 np0005531888 nova_compute[186788]: 2025-11-22 08:39:52.977 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:39:52 np0005531888 nova_compute[186788]: 2025-11-22 08:39:52.991 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:39:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:53.167 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:39:53 np0005531888 nova_compute[186788]: 2025-11-22 08:39:53.168 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:53 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:39:53.168 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:39:54 np0005531888 nova_compute[186788]: 2025-11-22 08:39:54.437 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:56 np0005531888 nova_compute[186788]: 2025-11-22 08:39:56.786 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:59 np0005531888 nova_compute[186788]: 2025-11-22 08:39:59.439 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:00.171 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:01 np0005531888 nova_compute[186788]: 2025-11-22 08:40:01.789 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:04 np0005531888 nova_compute[186788]: 2025-11-22 08:40:04.440 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:04 np0005531888 podman[247837]: 2025-11-22 08:40:04.675546786 +0000 UTC m=+0.048662149 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:40:04 np0005531888 podman[247838]: 2025-11-22 08:40:04.685449651 +0000 UTC m=+0.053924420 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.409 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.409 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.428 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.509 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.509 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.524 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.525 186792 INFO nova.compute.claims [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.725 186792 DEBUG nova.compute.provider_tree [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.736 186792 DEBUG nova.scheduler.client.report [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.757 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.758 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.801 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.801 186792 DEBUG nova.network.neutron [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.814 186792 INFO nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.835 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.923 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.925 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.926 186792 INFO nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Creating image(s)#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.926 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.927 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.927 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:06 np0005531888 nova_compute[186788]: 2025-11-22 08:40:06.947 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.006 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.007 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.008 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.023 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.079 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.080 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.114 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.115 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.115 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.170 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.171 186792 DEBUG nova.virt.disk.api [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.172 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.191 186792 DEBUG nova.policy [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.229 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.230 186792 DEBUG nova.virt.disk.api [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.230 186792 DEBUG nova.objects.instance [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.251 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.251 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Ensure instance console log exists: /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.252 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.252 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.252 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:07 np0005531888 nova_compute[186788]: 2025-11-22 08:40:07.867 186792 DEBUG nova.network.neutron [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Successfully created port: b45d9774-1141-4b22-81f2-234b0c06226c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.793 186792 DEBUG nova.network.neutron [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Successfully updated port: b45d9774-1141-4b22-81f2-234b0c06226c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.808 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.809 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.809 186792 DEBUG nova.network.neutron [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.922 186792 DEBUG nova.compute.manager [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-changed-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.922 186792 DEBUG nova.compute.manager [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing instance network info cache due to event network-changed-b45d9774-1141-4b22-81f2-234b0c06226c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.922 186792 DEBUG oslo_concurrency.lockutils [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:40:08 np0005531888 nova_compute[186788]: 2025-11-22 08:40:08.955 186792 DEBUG nova.network.neutron [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.783 186792 DEBUG nova.network.neutron [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.807 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.814 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Instance network_info: |[{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.823 186792 DEBUG oslo_concurrency.lockutils [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.826 186792 DEBUG nova.network.neutron [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing network info cache for port b45d9774-1141-4b22-81f2-234b0c06226c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.831 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Start _get_guest_xml network_info=[{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.837 186792 WARNING nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.842 186792 DEBUG nova.virt.libvirt.host [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.843 186792 DEBUG nova.virt.libvirt.host [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.849 186792 DEBUG nova.virt.libvirt.host [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.849 186792 DEBUG nova.virt.libvirt.host [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.850 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.851 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.851 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.851 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.852 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.852 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.852 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.852 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.852 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.853 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.853 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.853 186792 DEBUG nova.virt.hardware [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.857 186792 DEBUG nova.virt.libvirt.vif [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:40:06Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.857 186792 DEBUG nova.network.os_vif_util [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.858 186792 DEBUG nova.network.os_vif_util [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.859 186792 DEBUG nova.objects.instance [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.874 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <uuid>5966747e-45f2-43c0-9339-5011e53635fd</uuid>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <name>instance-000000ab</name>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:40:09</nova:creationTime>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <entry name="serial">5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <entry name="uuid">5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:7f:10:e5"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <target dev="tapb45d9774-11"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log" append="off"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:40:09 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:40:09 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:40:09 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:40:09 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.876 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Preparing to wait for external event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.876 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.877 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.877 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.878 186792 DEBUG nova.virt.libvirt.vif [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:40:06Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.878 186792 DEBUG nova.network.os_vif_util [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.879 186792 DEBUG nova.network.os_vif_util [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.879 186792 DEBUG os_vif [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.880 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.881 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.883 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.883 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb45d9774-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.884 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb45d9774-11, col_values=(('external_ids', {'iface-id': 'b45d9774-1141-4b22-81f2-234b0c06226c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:10:e5', 'vm-uuid': '5966747e-45f2-43c0-9339-5011e53635fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.885 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:09 np0005531888 NetworkManager[55166]: <info>  [1763800809.8865] manager: (tapb45d9774-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.888 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.892 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:09 np0005531888 nova_compute[186788]: 2025-11-22 08:40:09.893 186792 INFO os_vif [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11')#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.051 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.052 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.052 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:7f:10:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.053 186792 INFO nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Using config drive#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.381 186792 INFO nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Creating config drive at /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.386 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzg7f45l5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.510 186792 DEBUG oslo_concurrency.processutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzg7f45l5" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:10 np0005531888 kernel: tapb45d9774-11: entered promiscuous mode
Nov 22 03:40:10 np0005531888 NetworkManager[55166]: <info>  [1763800810.5790] manager: (tapb45d9774-11): new Tun device (/org/freedesktop/NetworkManager/Devices/331)
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.578 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:10Z|00695|binding|INFO|Claiming lport b45d9774-1141-4b22-81f2-234b0c06226c for this chassis.
Nov 22 03:40:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:10Z|00696|binding|INFO|b45d9774-1141-4b22-81f2-234b0c06226c: Claiming fa:16:3e:7f:10:e5 10.100.0.11
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.605 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.614 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:10:e5 10.100.0.11'], port_security=['fa:16:3e:7f:10:e5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33840711-4341-4e1a-a9eb-fdace57bd254', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7385b9c4-7ae8-4013-983d-fc34cdf06060', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa1f2ac2-785f-446c-9f84-7df67f45fc3e, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=b45d9774-1141-4b22-81f2-234b0c06226c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.616 104023 INFO neutron.agent.ovn.metadata.agent [-] Port b45d9774-1141-4b22-81f2-234b0c06226c in datapath 33840711-4341-4e1a-a9eb-fdace57bd254 bound to our chassis#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.617 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33840711-4341-4e1a-a9eb-fdace57bd254#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.635 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[990eaafc-180e-4362-8d95-193740ce1a3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.637 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap33840711-41 in ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:40:10 np0005531888 systemd-udevd[247914]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.640 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap33840711-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.641 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4c21cc04-e4f2-4204-8ac1-106e5bcc7070]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.642 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d7180744-3e8c-40f5-9a6f-66a8175a18cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 systemd-machined[153106]: New machine qemu-83-instance-000000ab.
Nov 22 03:40:10 np0005531888 NetworkManager[55166]: <info>  [1763800810.6574] device (tapb45d9774-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:40:10 np0005531888 NetworkManager[55166]: <info>  [1763800810.6584] device (tapb45d9774-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.657 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[f69e9c53-5572-407f-8e2e-c839c67c31b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.661 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:10Z|00697|binding|INFO|Setting lport b45d9774-1141-4b22-81f2-234b0c06226c ovn-installed in OVS
Nov 22 03:40:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:10Z|00698|binding|INFO|Setting lport b45d9774-1141-4b22-81f2-234b0c06226c up in Southbound
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.668 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 systemd[1]: Started Virtual Machine qemu-83-instance-000000ab.
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.675 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7720ddb7-1798-414d-a374-3eeba80a0a57]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.704 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f71d2a19-f9d3-4551-bc66-aae1e39d1c66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.710 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[23d3cb71-c6c6-41fe-9c79-8f1a67390e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 systemd-udevd[247918]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:40:10 np0005531888 NetworkManager[55166]: <info>  [1763800810.7113] manager: (tap33840711-40): new Veth device (/org/freedesktop/NetworkManager/Devices/332)
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.738 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e73522e3-a128-44a8-be94-9a022d134f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.741 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[000d92c5-e36a-4604-a855-bc2af1530f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 NetworkManager[55166]: <info>  [1763800810.7661] device (tap33840711-40): carrier: link connected
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.773 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5454ee99-cbd1-4126-a514-c8b40b2f11ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.790 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[afe1361e-4095-47f8-a5c8-4db1f120482e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33840711-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:4d:b4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755140, 'reachable_time': 31177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247947, 'error': None, 'target': 'ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.807 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fd1fc106-8c6e-40ea-a123-96d684aebbfc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:4db4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 755140, 'tstamp': 755140}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247948, 'error': None, 'target': 'ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.825 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3df4b22c-74d1-45ea-a8ab-6906cf7458a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33840711-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:4d:b4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755140, 'reachable_time': 31177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247949, 'error': None, 'target': 'ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.839 186792 DEBUG nova.network.neutron [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated VIF entry in instance network info cache for port b45d9774-1141-4b22-81f2-234b0c06226c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.840 186792 DEBUG nova.network.neutron [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.854 186792 DEBUG oslo_concurrency.lockutils [req-905eaee0-ab8e-4d03-b950-ade2e9ac5185 req-bca2c628-552f-45bb-9180-5b75fca7f6d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.857 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[79c518d5-6c2f-44df-82dd-a6fcd594b4d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.933 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[07b37c6a-8911-48dd-b9a1-35cf8077204b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.934 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33840711-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.935 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.935 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33840711-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:10 np0005531888 kernel: tap33840711-40: entered promiscuous mode
Nov 22 03:40:10 np0005531888 NetworkManager[55166]: <info>  [1763800810.9376] manager: (tap33840711-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.938 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.941 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33840711-40, col_values=(('external_ids', {'iface-id': 'c5e5de30-961a-4548-9448-54ac752e5c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:10Z|00699|binding|INFO|Releasing lport c5e5de30-961a-4548-9448-54ac752e5c80 from this chassis (sb_readonly=0)
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.944 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.944 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/33840711-4341-4e1a-a9eb-fdace57bd254.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/33840711-4341-4e1a-a9eb-fdace57bd254.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.945 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c479b69f-9077-4c74-ba9f-9854aea1ac4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.946 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-33840711-4341-4e1a-a9eb-fdace57bd254
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/33840711-4341-4e1a-a9eb-fdace57bd254.pid.haproxy
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 33840711-4341-4e1a-a9eb-fdace57bd254
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:40:10 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:10.946 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254', 'env', 'PROCESS_TAG=haproxy-33840711-4341-4e1a-a9eb-fdace57bd254', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/33840711-4341-4e1a-a9eb-fdace57bd254.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:40:10 np0005531888 nova_compute[186788]: 2025-11-22 08:40:10.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.007 186792 DEBUG nova.compute.manager [req-5e35e4c2-312a-4c5b-9b07-db67aee9fbe5 req-861a056f-41e9-45cb-8ff0-0c81ae1e7c2e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.008 186792 DEBUG oslo_concurrency.lockutils [req-5e35e4c2-312a-4c5b-9b07-db67aee9fbe5 req-861a056f-41e9-45cb-8ff0-0c81ae1e7c2e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.008 186792 DEBUG oslo_concurrency.lockutils [req-5e35e4c2-312a-4c5b-9b07-db67aee9fbe5 req-861a056f-41e9-45cb-8ff0-0c81ae1e7c2e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.008 186792 DEBUG oslo_concurrency.lockutils [req-5e35e4c2-312a-4c5b-9b07-db67aee9fbe5 req-861a056f-41e9-45cb-8ff0-0c81ae1e7c2e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.009 186792 DEBUG nova.compute.manager [req-5e35e4c2-312a-4c5b-9b07-db67aee9fbe5 req-861a056f-41e9-45cb-8ff0-0c81ae1e7c2e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Processing event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.310 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800811.30979, 5966747e-45f2-43c0-9339-5011e53635fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.311 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] VM Started (Lifecycle Event)#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.314 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.317 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.320 186792 INFO nova.virt.libvirt.driver [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Instance spawned successfully.#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.320 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.335 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.338 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.348 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.348 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.349 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.349 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.349 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.350 186792 DEBUG nova.virt.libvirt.driver [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.355 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.355 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800811.3099809, 5966747e-45f2-43c0-9339-5011e53635fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.355 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:40:11 np0005531888 podman[247987]: 2025-11-22 08:40:11.275057437 +0000 UTC m=+0.021864400 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.384 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.387 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763800811.316455, 5966747e-45f2-43c0-9339-5011e53635fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.388 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.424 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.428 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.430 186792 INFO nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Took 4.51 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.431 186792 DEBUG nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.459 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:40:11 np0005531888 podman[247987]: 2025-11-22 08:40:11.501046243 +0000 UTC m=+0.247853176 container create 327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.528 186792 INFO nova.compute.manager [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Took 5.06 seconds to build instance.#033[00m
Nov 22 03:40:11 np0005531888 nova_compute[186788]: 2025-11-22 08:40:11.545 186792 DEBUG oslo_concurrency.lockutils [None req-7492e5b2-faf0-4397-a092-e3d803cb386d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:11 np0005531888 systemd[1]: Started libpod-conmon-327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360.scope.
Nov 22 03:40:11 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:40:11 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a871537357e5ce2eac9b5d9b8e116700cff553292e3495542fc770a4c641d984/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:40:11 np0005531888 podman[247987]: 2025-11-22 08:40:11.594201448 +0000 UTC m=+0.341008381 container init 327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:40:11 np0005531888 podman[247987]: 2025-11-22 08:40:11.600553544 +0000 UTC m=+0.347360477 container start 327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:40:11 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [NOTICE]   (248007) : New worker (248009) forked
Nov 22 03:40:11 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [NOTICE]   (248007) : Loading success.
Nov 22 03:40:13 np0005531888 nova_compute[186788]: 2025-11-22 08:40:13.086 186792 DEBUG nova.compute.manager [req-a85194f7-c673-4d4c-b0ea-aa490ff3e37a req-b5985e60-09d3-4ce4-99c3-ec939e42044a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:13 np0005531888 nova_compute[186788]: 2025-11-22 08:40:13.087 186792 DEBUG oslo_concurrency.lockutils [req-a85194f7-c673-4d4c-b0ea-aa490ff3e37a req-b5985e60-09d3-4ce4-99c3-ec939e42044a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:13 np0005531888 nova_compute[186788]: 2025-11-22 08:40:13.087 186792 DEBUG oslo_concurrency.lockutils [req-a85194f7-c673-4d4c-b0ea-aa490ff3e37a req-b5985e60-09d3-4ce4-99c3-ec939e42044a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:13 np0005531888 nova_compute[186788]: 2025-11-22 08:40:13.087 186792 DEBUG oslo_concurrency.lockutils [req-a85194f7-c673-4d4c-b0ea-aa490ff3e37a req-b5985e60-09d3-4ce4-99c3-ec939e42044a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:13 np0005531888 nova_compute[186788]: 2025-11-22 08:40:13.087 186792 DEBUG nova.compute.manager [req-a85194f7-c673-4d4c-b0ea-aa490ff3e37a req-b5985e60-09d3-4ce4-99c3-ec939e42044a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:40:13 np0005531888 nova_compute[186788]: 2025-11-22 08:40:13.088 186792 WARNING nova.compute.manager [req-a85194f7-c673-4d4c-b0ea-aa490ff3e37a req-b5985e60-09d3-4ce4-99c3-ec939e42044a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received unexpected event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c for instance with vm_state active and task_state None.#033[00m
Nov 22 03:40:14 np0005531888 nova_compute[186788]: 2025-11-22 08:40:14.444 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:14 np0005531888 nova_compute[186788]: 2025-11-22 08:40:14.886 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:16 np0005531888 podman[248019]: 2025-11-22 08:40:16.677455977 +0000 UTC m=+0.048707110 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:40:16 np0005531888 podman[248018]: 2025-11-22 08:40:16.682784119 +0000 UTC m=+0.056981855 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:40:18 np0005531888 nova_compute[186788]: 2025-11-22 08:40:18.969 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:19 np0005531888 nova_compute[186788]: 2025-11-22 08:40:19.446 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:19 np0005531888 nova_compute[186788]: 2025-11-22 08:40:19.888 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:20 np0005531888 podman[248062]: 2025-11-22 08:40:20.694419348 +0000 UTC m=+0.063519065 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1755695350)
Nov 22 03:40:20 np0005531888 podman[248063]: 2025-11-22 08:40:20.704598249 +0000 UTC m=+0.069933063 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:40:20 np0005531888 podman[248064]: 2025-11-22 08:40:20.733232004 +0000 UTC m=+0.095440052 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:40:20 np0005531888 nova_compute[186788]: 2025-11-22 08:40:20.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:20 np0005531888 nova_compute[186788]: 2025-11-22 08:40:20.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:40:20 np0005531888 nova_compute[186788]: 2025-11-22 08:40:20.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:40:21 np0005531888 nova_compute[186788]: 2025-11-22 08:40:21.136 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:40:21 np0005531888 nova_compute[186788]: 2025-11-22 08:40:21.137 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:40:21 np0005531888 nova_compute[186788]: 2025-11-22 08:40:21.137 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:40:21 np0005531888 nova_compute[186788]: 2025-11-22 08:40:21.137 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:40:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:22Z|00700|binding|INFO|Releasing lport c5e5de30-961a-4548-9448-54ac752e5c80 from this chassis (sb_readonly=0)
Nov 22 03:40:22 np0005531888 NetworkManager[55166]: <info>  [1763800822.2604] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Nov 22 03:40:22 np0005531888 nova_compute[186788]: 2025-11-22 08:40:22.260 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:22 np0005531888 NetworkManager[55166]: <info>  [1763800822.2616] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Nov 22 03:40:22 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:22Z|00701|binding|INFO|Releasing lport c5e5de30-961a-4548-9448-54ac752e5c80 from this chassis (sb_readonly=0)
Nov 22 03:40:22 np0005531888 nova_compute[186788]: 2025-11-22 08:40:22.288 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:22 np0005531888 nova_compute[186788]: 2025-11-22 08:40:22.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:23 np0005531888 nova_compute[186788]: 2025-11-22 08:40:23.455 186792 DEBUG nova.compute.manager [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-changed-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:23 np0005531888 nova_compute[186788]: 2025-11-22 08:40:23.457 186792 DEBUG nova.compute.manager [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing instance network info cache due to event network-changed-b45d9774-1141-4b22-81f2-234b0c06226c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:40:23 np0005531888 nova_compute[186788]: 2025-11-22 08:40:23.457 186792 DEBUG oslo_concurrency.lockutils [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.332 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.351 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.352 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.353 186792 DEBUG oslo_concurrency.lockutils [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.353 186792 DEBUG nova.network.neutron [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing network info cache for port b45d9774-1141-4b22-81f2-234b0c06226c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.448 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:24 np0005531888 nova_compute[186788]: 2025-11-22 08:40:24.890 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:26Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:10:e5 10.100.0.11
Nov 22 03:40:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:26Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:10:e5 10.100.0.11
Nov 22 03:40:26 np0005531888 nova_compute[186788]: 2025-11-22 08:40:26.431 186792 DEBUG nova.network.neutron [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated VIF entry in instance network info cache for port b45d9774-1141-4b22-81f2-234b0c06226c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:40:26 np0005531888 nova_compute[186788]: 2025-11-22 08:40:26.432 186792 DEBUG nova.network.neutron [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:40:26 np0005531888 nova_compute[186788]: 2025-11-22 08:40:26.462 186792 DEBUG oslo_concurrency.lockutils [req-dea88c7d-d552-478a-877e-34c104531a9a req-734afb92-ea24-4871-817b-7cd602ca6a7a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:40:27 np0005531888 nova_compute[186788]: 2025-11-22 08:40:27.348 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:27 np0005531888 nova_compute[186788]: 2025-11-22 08:40:27.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:29 np0005531888 nova_compute[186788]: 2025-11-22 08:40:29.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:29 np0005531888 nova_compute[186788]: 2025-11-22 08:40:29.892 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:31 np0005531888 nova_compute[186788]: 2025-11-22 08:40:31.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:33 np0005531888 nova_compute[186788]: 2025-11-22 08:40:33.002 186792 INFO nova.compute.manager [None req-d3ed069b-2593-4327-8d9f-1bbe0720f3ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Get console output#033[00m
Nov 22 03:40:33 np0005531888 nova_compute[186788]: 2025-11-22 08:40:33.008 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:40:34 np0005531888 nova_compute[186788]: 2025-11-22 08:40:34.452 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:34 np0005531888 nova_compute[186788]: 2025-11-22 08:40:34.894 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:35 np0005531888 podman[248138]: 2025-11-22 08:40:35.685543194 +0000 UTC m=+0.057518378 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 03:40:35 np0005531888 podman[248137]: 2025-11-22 08:40:35.711801571 +0000 UTC m=+0.087351122 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:40:35 np0005531888 nova_compute[186788]: 2025-11-22 08:40:35.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:35 np0005531888 nova_compute[186788]: 2025-11-22 08:40:35.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:35 np0005531888 nova_compute[186788]: 2025-11-22 08:40:35.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:35 np0005531888 nova_compute[186788]: 2025-11-22 08:40:35.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:35 np0005531888 nova_compute[186788]: 2025-11-22 08:40:35.993 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:35 np0005531888 nova_compute[186788]: 2025-11-22 08:40:35.993 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.091 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.157 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.158 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.212 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.391 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.393 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5542MB free_disk=73.23831558227539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.394 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.394 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.481 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 5966747e-45f2-43c0-9339-5011e53635fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.482 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.482 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.500 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.523 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.524 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.543 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.565 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.618 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.633 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.664 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:40:36 np0005531888 nova_compute[186788]: 2025-11-22 08:40:36.664 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.855 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5966747e-45f2-43c0-9339-5011e53635fd', 'name': 'tempest-TestNetworkBasicOps-server-453774490', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ab', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '12f63a6d87a947758ab928c0d625ff06', 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'hostId': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.859 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5966747e-45f2-43c0-9339-5011e53635fd / tapb45d9774-11 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.859 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8207c923-a842-46ff-abaa-c28d05e86d10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.856719', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eadc46c8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '217ee8980b454ddd9c23bc4fdf3d5bcc429127382b30fa2296cd4b9a8c92d9d2'}]}, 'timestamp': '2025-11-22 08:40:36.859825', '_unique_id': '85142b8d89be449bb61f4d91e64880d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.860 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:36.861 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:36.862 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:36.862 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.884 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.885 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9eda615f-01e5-4fc4-85ec-acb9aade7ef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.862130', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae03300-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': 'a0ec24fe5cf47d6c315be9622947ce53ff1c17bf5a7397d0a3a818c3f669bd0f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.862130', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae0405c-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': 'f074ba8c352e3999d5db55b7ea8c5ab8399ea18afbaeafe023cc1a254f0640fd'}]}, 'timestamp': '2025-11-22 08:40:36.885831', '_unique_id': '9f6c1ae262784c9c955a9693166db63c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.888 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.888 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.latency volume: 765605834 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.888 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.latency volume: 56991771 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '847c3993-3b68-4330-836c-238c26ab7595', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 765605834, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.888169', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae0a772-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': 'e5c77fee6a1ddbb615ccf754a9e3b01436297d1837822eab8e2652b2ebdc245f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56991771, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.888169', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae0b294-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': '93c2e05faf4197a74645d24c856c76454ebb5bdb19369c02c8a395a1435cc0f6'}]}, 'timestamp': '2025-11-22 08:40:36.888742', '_unique_id': '15caf6c3d1ea4c87bbe0957de7916630'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.890 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0270da94-2a93-42cf-a11c-1d8ba5f70d5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.890271', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eae0f8d0-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '31b77a4eb85cf7e2a5dd659c97f8c6f09a7cc824b5d9b994f4dc4ef116aecc0f'}]}, 'timestamp': '2025-11-22 08:40:36.890551', '_unique_id': '9c07bf762bff4ce39aff49fc2da62ae2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.892 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.892 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>]
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.892 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.892 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.892 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f576f287-3fab-4122-ac62-972577b82107', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.892461', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae14e48-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': '4ac944fd8ea68f5907d6a1a584b047be01f5d306d0d0e2889e7fde8352010c50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.892461', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae1579e-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': '96eb0d54b15d49a10c8015cc5e23050018429b7dc9071f5567b655de7b8510d9'}]}, 'timestamp': '2025-11-22 08:40:36.892962', '_unique_id': 'a67a84f8d04e4100a0239ff0ebb65a6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.894 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.894 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.894 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>]
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.894 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dce89fec-d294-4e7b-95ff-a9dc25e0369f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.894816', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eae1a9e2-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '50e62361b86ec7cb69316a662d9a577f3d0beb55cc2ff3693e258eeca07906f7'}]}, 'timestamp': '2025-11-22 08:40:36.895087', '_unique_id': '0ff6ed6cad4644d9a66acd4e00a7c1a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.895 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.906 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.906 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e6ce8d3-0d5e-42ff-81dd-a27bfd3b20a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.896609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae36bf6-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.5961166, 'message_signature': 'bf55fd3570c681d10db7153f711a1c41ec9d7c0ff505045688761fb971670020'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.896609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae37bc8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.5961166, 'message_signature': '204ea0de26ac7246482e18f83a3efdd3e5ea7a714d168ec7547f759d43b29f98'}]}, 'timestamp': '2025-11-22 08:40:36.907072', '_unique_id': '4e9367d4f9ac4663b9867a54451b9505'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.909 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.909 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>]
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.910 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.bytes volume: 30484992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.910 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3db8652-e7be-4a2f-a136-4ff0ca9b4809', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30484992, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.910315', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae4084a-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': 'bb95a2468954b402b8b371713d90733121b5140c260dac6efca6f9bffc7ed6a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.910315', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae4148e-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': '21569585da09f97c7ade4a93c077d99abbc16f7b4bdb149c45d7fc67f01b7020'}]}, 'timestamp': '2025-11-22 08:40:36.910940', '_unique_id': '2f12757006d046bf87d188a0bafd6d4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.911 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.913 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d331c89-22ad-48eb-95d0-8a3c2bcfe638', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.913007', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eae472e4-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': 'bd56e9427ebde389a5b9ba9ab20bce2b8ff13d3718cc41f699457beef3c0bc82'}]}, 'timestamp': '2025-11-22 08:40:36.913386', '_unique_id': 'ad4db3f022484ef38cf62a10185a767b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.914 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.915 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.916 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1fcd65c-0a44-498f-8353-0e9e5acdd452', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.915702', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae4dcfc-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.5961166, 'message_signature': '660426223e4c54bb8ea7718714ddb7ab702f1aaa01592a2ebdb2cbee9a39b4e0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.915702', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae4eb48-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.5961166, 'message_signature': '1034c35a7c74f08a0deaef701b04c4f857d20bb8ee0eda75729ac64f34655deb'}]}, 'timestamp': '2025-11-22 08:40:36.916447', '_unique_id': '12b9a55e879c4286a31eece58fbc7773'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.917 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.918 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1848b46c-5b8c-441d-9ba5-816c500232cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.918337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eae540f2-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '22a8d4350bdd9584eb79908a892e5cf51d6370504ea6fac2b9a2b6332e0f31f4'}]}, 'timestamp': '2025-11-22 08:40:36.918629', '_unique_id': '5a4f382b74624c2ea2b8308f62cbea7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64c11dd0-2158-4259-8524-28449aadfd48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.920100', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eae585a8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': 'a8caae72278c269967f29d2a05dca4920c17d5beb1eb6427d6f74566586f5402'}]}, 'timestamp': '2025-11-22 08:40:36.920371', '_unique_id': '5b75a35b25de4a1b87d0e7e4b96439ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.920 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.938 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/memory.usage volume: 42.7109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbc55199-fd66-42d5-8041-e27088581ec4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7109375, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'timestamp': '2025-11-22T08:40:36.922098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'eae87196-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.638205027, 'message_signature': 'd19f20c4a5772c872a7a63bff0a24719670d4f634b2eaa8114a598c199f8e779'}]}, 'timestamp': '2025-11-22 08:40:36.939688', '_unique_id': '301aff3d22cc4df29aedc5b440e246ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.942 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.942 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ec7e2ca-b2fe-433c-ad90-90057e85be37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.942313', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae8ede2-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.5961166, 'message_signature': 'ae931aca65616c2fbca02fb9084b3dcb0f31c360c7c0a8039375add6b412fe59'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.942313', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae8fbd4-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.5961166, 'message_signature': '695c9c77890ce8f38b98dd39d5d2a386c23353bf315f141722ccdf44ab5660bb'}]}, 'timestamp': '2025-11-22 08:40:36.943462', '_unique_id': 'e2d9307139a1443daa9f576817dd2ec6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.945 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.bytes volume: 3418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '604a1580-8ba3-4dd9-a2fd-b91cf9a1c7b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3418, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.945817', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eae973e8-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '5a759ec072791ab82dbbc8c40d1ff4a3e15a1958f9ccf3b80af1232829cd92b4'}]}, 'timestamp': '2025-11-22 08:40:36.946148', '_unique_id': 'e2a37b1ee46340179fc7c25423d3bdf7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.947 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.latency volume: 3872803399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.948 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '733b37ce-9636-4fdb-8235-54f839cea56b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3872803399, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.947859', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eae9c352-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': '620b07443cfcbbac869fff363600e3e1974f8b22294f0a6c1bca6fb35ab08617'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.947859', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eae9ceba-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': 'e56048936d51c526a34447ffc894b1cca45bda4bfd9c327e7c047957586df720'}]}, 'timestamp': '2025-11-22 08:40:36.948472', '_unique_id': '363f728cd27447f5acff5f8bb98c0a26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.950 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.bytes volume: 4389 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b4c328f-cec1-44d6-9ed9-a9459e55b16c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4389, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.950331', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eaea2392-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '903b34e359101b6bc8a184e1be653dc96dc2481b39f01aa6003ef6ceefa7156a'}]}, 'timestamp': '2025-11-22 08:40:36.950658', '_unique_id': '364d83940c614b44b6c3c8614b24c08a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.952 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26aed321-9177-4a5f-b777-7309530b0eb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.952381', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eaea72de-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': '5be51bf6f04efd571e517061127c8247a95791b6ba3ad2f2ef001235c48b87c9'}]}, 'timestamp': '2025-11-22 08:40:36.952705', '_unique_id': '06c24cafe36c485eb4985c0394a6876a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.954 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.954 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.955 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97fd96c9-c081-4406-9145-4588466d10e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:40:36.954679', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'eaeace28-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': 'b911eea51b354cd4865a53e8b167a47c1bda1b471f3da5684086d1442802376f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:40:36.954679', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'eaeae264-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.561634281, 'message_signature': '0e92b4a83b61e666de2c982aa26c7afcd7c0af4dcc45d0af5eeab2d19392659d'}]}, 'timestamp': '2025-11-22 08:40:36.955550', '_unique_id': '82d3c57234264a05ad7636003b67e04c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.958 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.958 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-453774490>]
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.958 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd36a34b-9965-43e3-acf9-832cf875065f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:40:36.958769', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': 'eaeb6ea0-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.556207288, 'message_signature': 'bbd98a0afe3de99c5a202c240352ba825676198271368eb865fa462e4827a204'}]}, 'timestamp': '2025-11-22 08:40:36.959158', '_unique_id': '924dd1ad53514ccc867d63f737a7bb3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.959 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.961 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.961 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/cpu volume: 13400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db5f79a8-3031-474b-8367-f492e97fa2f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13400000000, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'timestamp': '2025-11-22T08:40:36.961284', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'eaebd08e-c77e-11f0-941d-fa163e6775e5', 'monotonic_time': 7577.638205027, 'message_signature': 'b44d894b10c14fda20e067cd8ea3fa3289f6f6289498818e0a473287344c499b'}]}, 'timestamp': '2025-11-22 08:40:36.961642', '_unique_id': 'bc6aff8c2dbf4d94a6fa847cfb5b3e1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:40:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:40:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:40:37 np0005531888 nova_compute[186788]: 2025-11-22 08:40:37.665 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:38 np0005531888 nova_compute[186788]: 2025-11-22 08:40:38.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:38 np0005531888 nova_compute[186788]: 2025-11-22 08:40:38.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:40:39 np0005531888 nova_compute[186788]: 2025-11-22 08:40:39.454 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:39 np0005531888 nova_compute[186788]: 2025-11-22 08:40:39.896 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:40.246 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:40:40 np0005531888 nova_compute[186788]: 2025-11-22 08:40:40.246 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:40.247 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:40:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:41.249 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:42 np0005531888 nova_compute[186788]: 2025-11-22 08:40:42.337 186792 DEBUG oslo_concurrency.lockutils [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "interface-5966747e-45f2-43c0-9339-5011e53635fd-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:42 np0005531888 nova_compute[186788]: 2025-11-22 08:40:42.337 186792 DEBUG oslo_concurrency.lockutils [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-5966747e-45f2-43c0-9339-5011e53635fd-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:42 np0005531888 nova_compute[186788]: 2025-11-22 08:40:42.338 186792 DEBUG nova.objects.instance [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'flavor' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:40:42 np0005531888 nova_compute[186788]: 2025-11-22 08:40:42.762 186792 DEBUG nova.objects.instance [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:40:42 np0005531888 nova_compute[186788]: 2025-11-22 08:40:42.772 186792 DEBUG nova.network.neutron [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:40:42 np0005531888 nova_compute[186788]: 2025-11-22 08:40:42.918 186792 DEBUG nova.policy [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:40:43 np0005531888 nova_compute[186788]: 2025-11-22 08:40:43.715 186792 DEBUG nova.network.neutron [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Successfully created port: e755646b-370c-417b-bd1d-eb2939ed2c4e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.520 186792 DEBUG nova.network.neutron [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Successfully updated port: e755646b-370c-417b-bd1d-eb2939ed2c4e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.535 186792 DEBUG oslo_concurrency.lockutils [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.535 186792 DEBUG oslo_concurrency.lockutils [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.536 186792 DEBUG nova.network.neutron [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.691 186792 DEBUG nova.compute.manager [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-changed-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.692 186792 DEBUG nova.compute.manager [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing instance network info cache due to event network-changed-e755646b-370c-417b-bd1d-eb2939ed2c4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.692 186792 DEBUG oslo_concurrency.lockutils [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:40:44 np0005531888 nova_compute[186788]: 2025-11-22 08:40:44.898 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:47 np0005531888 podman[248185]: 2025-11-22 08:40:47.696160157 +0000 UTC m=+0.060610234 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:40:47 np0005531888 podman[248184]: 2025-11-22 08:40:47.696072535 +0000 UTC m=+0.062441548 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.200 186792 DEBUG nova.network.neutron [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.230 186792 DEBUG oslo_concurrency.lockutils [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.231 186792 DEBUG oslo_concurrency.lockutils [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.231 186792 DEBUG nova.network.neutron [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing network info cache for port e755646b-370c-417b-bd1d-eb2939ed2c4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.234 186792 DEBUG nova.virt.libvirt.vif [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.234 186792 DEBUG nova.network.os_vif_util [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.235 186792 DEBUG nova.network.os_vif_util [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.235 186792 DEBUG os_vif [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.236 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.236 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.236 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.239 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.240 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape755646b-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.240 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape755646b-37, col_values=(('external_ids', {'iface-id': 'e755646b-370c-417b-bd1d-eb2939ed2c4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:cd:dd', 'vm-uuid': '5966747e-45f2-43c0-9339-5011e53635fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.241 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.2427] manager: (tape755646b-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.243 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.248 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.249 186792 INFO os_vif [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37')#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.250 186792 DEBUG nova.virt.libvirt.vif [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.250 186792 DEBUG nova.network.os_vif_util [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.251 186792 DEBUG nova.network.os_vif_util [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.256 186792 DEBUG nova.virt.libvirt.guest [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:c0:cd:dd"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <target dev="tape755646b-37"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]: </interface>
Nov 22 03:40:48 np0005531888 nova_compute[186788]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:40:48 np0005531888 kernel: tape755646b-37: entered promiscuous mode
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.2701] manager: (tape755646b-37): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Nov 22 03:40:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:48Z|00702|binding|INFO|Claiming lport e755646b-370c-417b-bd1d-eb2939ed2c4e for this chassis.
Nov 22 03:40:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:48Z|00703|binding|INFO|e755646b-370c-417b-bd1d-eb2939ed2c4e: Claiming fa:16:3e:c0:cd:dd 10.100.0.21
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.270 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.292 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:cd:dd 10.100.0.21'], port_security=['fa:16:3e:c0:cd:dd 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8d3a649-095f-4b94-af55-194302b82348', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8bf1006-f131-4804-9af5-1455421df1f6, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e755646b-370c-417b-bd1d-eb2939ed2c4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.293 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e755646b-370c-417b-bd1d-eb2939ed2c4e in datapath b8d3a649-095f-4b94-af55-194302b82348 bound to our chassis#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.295 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8d3a649-095f-4b94-af55-194302b82348#033[00m
Nov 22 03:40:48 np0005531888 systemd-udevd[248231]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:40:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:48Z|00704|binding|INFO|Setting lport e755646b-370c-417b-bd1d-eb2939ed2c4e ovn-installed in OVS
Nov 22 03:40:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:48Z|00705|binding|INFO|Setting lport e755646b-370c-417b-bd1d-eb2939ed2c4e up in Southbound
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.309 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.308 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa7c3bc-f411-4b5d-85b6-2a8791a1782e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.310 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8d3a649-01 in ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.3139] device (tape755646b-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.312 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8d3a649-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.312 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c19c6395-9b3b-4afa-8361-3506295c20d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.3151] device (tape755646b-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.315 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ed410dbb-4324-4536-a34f-08d4e7904d51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.329 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[7d983af1-4a19-46b0-8835-6baa94a6425e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.342 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[711d886e-e0d5-4ea0-86c4-10b32d98e94b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.374 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5083c559-4fbf-4e2c-8a24-58f11a44826d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.379 186792 DEBUG nova.virt.libvirt.driver [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.379 186792 DEBUG nova.virt.libvirt.driver [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.380 186792 DEBUG nova.virt.libvirt.driver [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:7f:10:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.380 186792 DEBUG nova.virt.libvirt.driver [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:c0:cd:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:40:48 np0005531888 systemd-udevd[248234]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.3814] manager: (tapb8d3a649-00): new Veth device (/org/freedesktop/NetworkManager/Devices/338)
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.381 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5209d529-a9af-437a-b34e-94d073e39549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.409 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c566404c-89da-4a11-ba38-6147015340c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.412 186792 DEBUG nova.virt.libvirt.guest [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:40:48</nova:creationTime>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:40:48 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    <nova:port uuid="e755646b-370c-417b-bd1d-eb2939ed2c4e">
Nov 22 03:40:48 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:40:48 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:40:48 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:40:48 np0005531888 nova_compute[186788]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.413 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[59e9d31e-cb70-4863-8709-912dfe0b86df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.436 186792 DEBUG oslo_concurrency.lockutils [None req-5ab14365-3b6c-4414-b727-d75e00030d50 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-5966747e-45f2-43c0-9339-5011e53635fd-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.4398] device (tapb8d3a649-00): carrier: link connected
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.446 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e31aba-3cfb-4e84-a71f-d48da89f99f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.464 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a963a9f1-e340-4d12-8c47-9ad979d383b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8d3a649-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8d:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758908, 'reachable_time': 37971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248258, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.484 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[59ed4191-c7b5-4b4c-a5d6-6289603130c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:8d73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 758908, 'tstamp': 758908}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248259, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.501 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca34d345-8ed2-4520-80c0-05a11b6cd4a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8d3a649-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8d:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758908, 'reachable_time': 37971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248260, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.528 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5f01e2-b816-45e5-bb25-a22753fa8765]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.579 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ba77ce1e-133c-44b2-87a9-84d52a290356]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.580 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8d3a649-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.580 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.581 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8d3a649-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531888 kernel: tapb8d3a649-00: entered promiscuous mode
Nov 22 03:40:48 np0005531888 NetworkManager[55166]: <info>  [1763800848.5834] manager: (tapb8d3a649-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.582 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.585 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.586 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8d3a649-00, col_values=(('external_ids', {'iface-id': '02280f12-e9a8-46b2-8ba0-7dd36f2a11f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.587 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:48Z|00706|binding|INFO|Releasing lport 02280f12-e9a8-46b2-8ba0-7dd36f2a11f6 from this chassis (sb_readonly=0)
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.588 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.589 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8d3a649-095f-4b94-af55-194302b82348.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8d3a649-095f-4b94-af55-194302b82348.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.590 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6df7a109-67b4-476c-861e-9e9781f39a76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.591 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-b8d3a649-095f-4b94-af55-194302b82348
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/b8d3a649-095f-4b94-af55-194302b82348.pid.haproxy
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID b8d3a649-095f-4b94-af55-194302b82348
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:40:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:40:48.593 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'env', 'PROCESS_TAG=haproxy-b8d3a649-095f-4b94-af55-194302b82348', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8d3a649-095f-4b94-af55-194302b82348.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:40:48 np0005531888 nova_compute[186788]: 2025-11-22 08:40:48.600 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531888 podman[248292]: 2025-11-22 08:40:48.995436543 +0000 UTC m=+0.061437265 container create db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:40:49 np0005531888 systemd[1]: Started libpod-conmon-db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c.scope.
Nov 22 03:40:49 np0005531888 podman[248292]: 2025-11-22 08:40:48.963130967 +0000 UTC m=+0.029131709 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:40:49 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:40:49 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35aac08efb62a4230e3b17a54be3baaedbf3ba68fc952d41ba8fc08674084c44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:40:49 np0005531888 podman[248292]: 2025-11-22 08:40:49.093249963 +0000 UTC m=+0.159250705 container init db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:40:49 np0005531888 podman[248292]: 2025-11-22 08:40:49.099300832 +0000 UTC m=+0.165301544 container start db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:40:49 np0005531888 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[248307]: [NOTICE]   (248311) : New worker (248313) forked
Nov 22 03:40:49 np0005531888 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[248307]: [NOTICE]   (248311) : Loading success.
Nov 22 03:40:49 np0005531888 nova_compute[186788]: 2025-11-22 08:40:49.458 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:50 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:50Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:cd:dd 10.100.0.21
Nov 22 03:40:50 np0005531888 ovn_controller[95067]: 2025-11-22T08:40:50Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:cd:dd 10.100.0.21
Nov 22 03:40:50 np0005531888 nova_compute[186788]: 2025-11-22 08:40:50.539 186792 DEBUG nova.compute.manager [req-e5861c49-110a-4c59-8047-e1ca40b5ee9f req-3d10ddc2-8ee1-43ce-b70e-0e2703b85208 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:50 np0005531888 nova_compute[186788]: 2025-11-22 08:40:50.540 186792 DEBUG oslo_concurrency.lockutils [req-e5861c49-110a-4c59-8047-e1ca40b5ee9f req-3d10ddc2-8ee1-43ce-b70e-0e2703b85208 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:50 np0005531888 nova_compute[186788]: 2025-11-22 08:40:50.540 186792 DEBUG oslo_concurrency.lockutils [req-e5861c49-110a-4c59-8047-e1ca40b5ee9f req-3d10ddc2-8ee1-43ce-b70e-0e2703b85208 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:50 np0005531888 nova_compute[186788]: 2025-11-22 08:40:50.541 186792 DEBUG oslo_concurrency.lockutils [req-e5861c49-110a-4c59-8047-e1ca40b5ee9f req-3d10ddc2-8ee1-43ce-b70e-0e2703b85208 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:50 np0005531888 nova_compute[186788]: 2025-11-22 08:40:50.541 186792 DEBUG nova.compute.manager [req-e5861c49-110a-4c59-8047-e1ca40b5ee9f req-3d10ddc2-8ee1-43ce-b70e-0e2703b85208 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:40:50 np0005531888 nova_compute[186788]: 2025-11-22 08:40:50.541 186792 WARNING nova.compute.manager [req-e5861c49-110a-4c59-8047-e1ca40b5ee9f req-3d10ddc2-8ee1-43ce-b70e-0e2703b85208 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received unexpected event network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e for instance with vm_state active and task_state None.#033[00m
Nov 22 03:40:51 np0005531888 podman[248322]: 2025-11-22 08:40:51.681288817 +0000 UTC m=+0.054389761 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 22 03:40:51 np0005531888 podman[248323]: 2025-11-22 08:40:51.686795562 +0000 UTC m=+0.058104483 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 22 03:40:51 np0005531888 podman[248324]: 2025-11-22 08:40:51.737935122 +0000 UTC m=+0.106885094 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 03:40:52 np0005531888 nova_compute[186788]: 2025-11-22 08:40:52.884 186792 DEBUG nova.compute.manager [req-7235da20-a185-482c-ad86-913339da78ae req-795d2cdf-57b6-4e7f-8a08-18da66a70fae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:40:52 np0005531888 nova_compute[186788]: 2025-11-22 08:40:52.884 186792 DEBUG oslo_concurrency.lockutils [req-7235da20-a185-482c-ad86-913339da78ae req-795d2cdf-57b6-4e7f-8a08-18da66a70fae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:52 np0005531888 nova_compute[186788]: 2025-11-22 08:40:52.884 186792 DEBUG oslo_concurrency.lockutils [req-7235da20-a185-482c-ad86-913339da78ae req-795d2cdf-57b6-4e7f-8a08-18da66a70fae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:52 np0005531888 nova_compute[186788]: 2025-11-22 08:40:52.885 186792 DEBUG oslo_concurrency.lockutils [req-7235da20-a185-482c-ad86-913339da78ae req-795d2cdf-57b6-4e7f-8a08-18da66a70fae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:52 np0005531888 nova_compute[186788]: 2025-11-22 08:40:52.885 186792 DEBUG nova.compute.manager [req-7235da20-a185-482c-ad86-913339da78ae req-795d2cdf-57b6-4e7f-8a08-18da66a70fae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:40:52 np0005531888 nova_compute[186788]: 2025-11-22 08:40:52.885 186792 WARNING nova.compute.manager [req-7235da20-a185-482c-ad86-913339da78ae req-795d2cdf-57b6-4e7f-8a08-18da66a70fae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received unexpected event network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e for instance with vm_state active and task_state None.#033[00m
Nov 22 03:40:53 np0005531888 nova_compute[186788]: 2025-11-22 08:40:53.241 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:54 np0005531888 nova_compute[186788]: 2025-11-22 08:40:54.397 186792 DEBUG nova.network.neutron [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated VIF entry in instance network info cache for port e755646b-370c-417b-bd1d-eb2939ed2c4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:40:54 np0005531888 nova_compute[186788]: 2025-11-22 08:40:54.397 186792 DEBUG nova.network.neutron [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:40:54 np0005531888 nova_compute[186788]: 2025-11-22 08:40:54.444 186792 DEBUG oslo_concurrency.lockutils [req-9a2d5525-5d84-40fd-b1e5-545da05ce7da req-36c0e6cf-6c61-4657-adda-580226fb8891 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:40:54 np0005531888 nova_compute[186788]: 2025-11-22 08:40:54.460 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:58 np0005531888 nova_compute[186788]: 2025-11-22 08:40:58.244 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:59 np0005531888 nova_compute[186788]: 2025-11-22 08:40:59.462 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:00 np0005531888 nova_compute[186788]: 2025-11-22 08:41:00.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:03 np0005531888 nova_compute[186788]: 2025-11-22 08:41:03.245 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:04 np0005531888 nova_compute[186788]: 2025-11-22 08:41:04.463 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:06 np0005531888 podman[248385]: 2025-11-22 08:41:06.682740214 +0000 UTC m=+0.051865149 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:41:06 np0005531888 podman[248386]: 2025-11-22 08:41:06.710628471 +0000 UTC m=+0.073421770 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:41:08 np0005531888 nova_compute[186788]: 2025-11-22 08:41:08.247 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:09 np0005531888 nova_compute[186788]: 2025-11-22 08:41:09.464 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:13 np0005531888 nova_compute[186788]: 2025-11-22 08:41:13.249 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:14 np0005531888 nova_compute[186788]: 2025-11-22 08:41:14.465 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:18 np0005531888 nova_compute[186788]: 2025-11-22 08:41:18.250 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:18 np0005531888 podman[248431]: 2025-11-22 08:41:18.669389348 +0000 UTC m=+0.042480547 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:41:18 np0005531888 podman[248430]: 2025-11-22 08:41:18.674376801 +0000 UTC m=+0.051274874 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:41:18 np0005531888 nova_compute[186788]: 2025-11-22 08:41:18.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:19 np0005531888 nova_compute[186788]: 2025-11-22 08:41:19.467 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:22 np0005531888 podman[248474]: 2025-11-22 08:41:22.693102777 +0000 UTC m=+0.065554656 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 03:41:22 np0005531888 podman[248475]: 2025-11-22 08:41:22.711561342 +0000 UTC m=+0.080015002 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 22 03:41:22 np0005531888 podman[248476]: 2025-11-22 08:41:22.73343809 +0000 UTC m=+0.091042543 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:41:22 np0005531888 nova_compute[186788]: 2025-11-22 08:41:22.958 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:22 np0005531888 nova_compute[186788]: 2025-11-22 08:41:22.958 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:41:22 np0005531888 nova_compute[186788]: 2025-11-22 08:41:22.958 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:41:23 np0005531888 nova_compute[186788]: 2025-11-22 08:41:23.255 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:23 np0005531888 nova_compute[186788]: 2025-11-22 08:41:23.265 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:41:23 np0005531888 nova_compute[186788]: 2025-11-22 08:41:23.265 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:41:23 np0005531888 nova_compute[186788]: 2025-11-22 08:41:23.266 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:41:23 np0005531888 nova_compute[186788]: 2025-11-22 08:41:23.266 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:41:24 np0005531888 nova_compute[186788]: 2025-11-22 08:41:24.470 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:28 np0005531888 nova_compute[186788]: 2025-11-22 08:41:28.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:29 np0005531888 nova_compute[186788]: 2025-11-22 08:41:29.472 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:29 np0005531888 nova_compute[186788]: 2025-11-22 08:41:29.886 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:41:30 np0005531888 nova_compute[186788]: 2025-11-22 08:41:30.007 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:41:30 np0005531888 nova_compute[186788]: 2025-11-22 08:41:30.008 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:41:30 np0005531888 nova_compute[186788]: 2025-11-22 08:41:30.008 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:32 np0005531888 nova_compute[186788]: 2025-11-22 08:41:32.000 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:33 np0005531888 nova_compute[186788]: 2025-11-22 08:41:33.260 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:33 np0005531888 nova_compute[186788]: 2025-11-22 08:41:33.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:34 np0005531888 nova_compute[186788]: 2025-11-22 08:41:34.474 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:35 np0005531888 nova_compute[186788]: 2025-11-22 08:41:35.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:35 np0005531888 nova_compute[186788]: 2025-11-22 08:41:35.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:41:35 np0005531888 nova_compute[186788]: 2025-11-22 08:41:35.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:41:35 np0005531888 nova_compute[186788]: 2025-11-22 08:41:35.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:41:35 np0005531888 nova_compute[186788]: 2025-11-22 08:41:35.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.043 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.102 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.103 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.162 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.341 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.343 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5525MB free_disk=73.23738479614258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.343 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.343 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.555 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 5966747e-45f2-43c0-9339-5011e53635fd actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.555 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.555 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.782 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.799 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.801 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:41:36 np0005531888 nova_compute[186788]: 2025-11-22 08:41:36.802 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:41:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:41:36.863 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:41:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:41:36.864 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:41:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:41:36.865 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:41:37 np0005531888 podman[248543]: 2025-11-22 08:41:37.672011432 +0000 UTC m=+0.041891823 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:41:37 np0005531888 podman[248544]: 2025-11-22 08:41:37.696463814 +0000 UTC m=+0.058076191 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:41:37 np0005531888 nova_compute[186788]: 2025-11-22 08:41:37.802 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:37 np0005531888 nova_compute[186788]: 2025-11-22 08:41:37.803 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:38 np0005531888 nova_compute[186788]: 2025-11-22 08:41:38.261 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:39 np0005531888 nova_compute[186788]: 2025-11-22 08:41:39.477 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:40 np0005531888 nova_compute[186788]: 2025-11-22 08:41:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:40 np0005531888 nova_compute[186788]: 2025-11-22 08:41:40.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:41:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:41:41Z|00707|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Nov 22 03:41:43 np0005531888 nova_compute[186788]: 2025-11-22 08:41:43.263 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:43 np0005531888 nova_compute[186788]: 2025-11-22 08:41:43.497 186792 DEBUG nova.compute.manager [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-changed-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:41:43 np0005531888 nova_compute[186788]: 2025-11-22 08:41:43.498 186792 DEBUG nova.compute.manager [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing instance network info cache due to event network-changed-e755646b-370c-417b-bd1d-eb2939ed2c4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:41:43 np0005531888 nova_compute[186788]: 2025-11-22 08:41:43.498 186792 DEBUG oslo_concurrency.lockutils [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:41:43 np0005531888 nova_compute[186788]: 2025-11-22 08:41:43.498 186792 DEBUG oslo_concurrency.lockutils [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:41:43 np0005531888 nova_compute[186788]: 2025-11-22 08:41:43.498 186792 DEBUG nova.network.neutron [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing network info cache for port e755646b-370c-417b-bd1d-eb2939ed2c4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:41:44 np0005531888 nova_compute[186788]: 2025-11-22 08:41:44.479 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:45 np0005531888 nova_compute[186788]: 2025-11-22 08:41:45.674 186792 DEBUG nova.network.neutron [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated VIF entry in instance network info cache for port e755646b-370c-417b-bd1d-eb2939ed2c4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:41:45 np0005531888 nova_compute[186788]: 2025-11-22 08:41:45.675 186792 DEBUG nova.network.neutron [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:41:45 np0005531888 nova_compute[186788]: 2025-11-22 08:41:45.694 186792 DEBUG oslo_concurrency.lockutils [req-967dabc3-7e90-4183-9772-b198e01be5d2 req-b4125a7c-88ce-4689-8b70-5645c3b2d9fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:41:48 np0005531888 nova_compute[186788]: 2025-11-22 08:41:48.265 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:49 np0005531888 nova_compute[186788]: 2025-11-22 08:41:49.483 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:49 np0005531888 podman[248586]: 2025-11-22 08:41:49.69131744 +0000 UTC m=+0.051452629 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:41:49 np0005531888 podman[248587]: 2025-11-22 08:41:49.69495985 +0000 UTC m=+0.051872949 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:41:53 np0005531888 nova_compute[186788]: 2025-11-22 08:41:53.267 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:53 np0005531888 podman[248631]: 2025-11-22 08:41:53.682396855 +0000 UTC m=+0.054730849 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:41:53 np0005531888 podman[248632]: 2025-11-22 08:41:53.693505479 +0000 UTC m=+0.059956468 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:41:53 np0005531888 podman[248633]: 2025-11-22 08:41:53.71181885 +0000 UTC m=+0.076432204 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:41:54 np0005531888 nova_compute[186788]: 2025-11-22 08:41:54.485 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:58 np0005531888 nova_compute[186788]: 2025-11-22 08:41:58.269 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:59 np0005531888 nova_compute[186788]: 2025-11-22 08:41:59.487 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:03 np0005531888 nova_compute[186788]: 2025-11-22 08:42:03.271 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:04 np0005531888 nova_compute[186788]: 2025-11-22 08:42:04.491 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:08 np0005531888 nova_compute[186788]: 2025-11-22 08:42:08.273 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:08 np0005531888 podman[248697]: 2025-11-22 08:42:08.675271126 +0000 UTC m=+0.045260817 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:42:08 np0005531888 podman[248698]: 2025-11-22 08:42:08.680942645 +0000 UTC m=+0.047254255 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:42:09 np0005531888 nova_compute[186788]: 2025-11-22 08:42:09.493 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:12 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:12Z|00708|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Nov 22 03:42:13 np0005531888 nova_compute[186788]: 2025-11-22 08:42:13.275 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:14 np0005531888 nova_compute[186788]: 2025-11-22 08:42:14.496 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:18 np0005531888 nova_compute[186788]: 2025-11-22 08:42:18.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:19 np0005531888 nova_compute[186788]: 2025-11-22 08:42:19.499 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:20 np0005531888 podman[248741]: 2025-11-22 08:42:20.675232159 +0000 UTC m=+0.045867940 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:42:20 np0005531888 podman[248740]: 2025-11-22 08:42:20.710620561 +0000 UTC m=+0.084115594 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:42:20 np0005531888 nova_compute[186788]: 2025-11-22 08:42:20.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:23 np0005531888 nova_compute[186788]: 2025-11-22 08:42:23.278 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:23 np0005531888 nova_compute[186788]: 2025-11-22 08:42:23.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:23 np0005531888 nova_compute[186788]: 2025-11-22 08:42:23.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:42:23 np0005531888 nova_compute[186788]: 2025-11-22 08:42:23.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:42:24 np0005531888 nova_compute[186788]: 2025-11-22 08:42:24.368 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:42:24 np0005531888 nova_compute[186788]: 2025-11-22 08:42:24.368 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:42:24 np0005531888 nova_compute[186788]: 2025-11-22 08:42:24.368 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:42:24 np0005531888 nova_compute[186788]: 2025-11-22 08:42:24.368 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:42:24 np0005531888 nova_compute[186788]: 2025-11-22 08:42:24.500 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:24 np0005531888 podman[248781]: 2025-11-22 08:42:24.69132061 +0000 UTC m=+0.056708389 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 22 03:42:24 np0005531888 podman[248782]: 2025-11-22 08:42:24.697468221 +0000 UTC m=+0.060523872 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 03:42:24 np0005531888 podman[248783]: 2025-11-22 08:42:24.752463805 +0000 UTC m=+0.110679767 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 03:42:28 np0005531888 nova_compute[186788]: 2025-11-22 08:42:28.280 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:29 np0005531888 nova_compute[186788]: 2025-11-22 08:42:29.400 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:42:29 np0005531888 nova_compute[186788]: 2025-11-22 08:42:29.431 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:42:29 np0005531888 nova_compute[186788]: 2025-11-22 08:42:29.431 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:42:29 np0005531888 nova_compute[186788]: 2025-11-22 08:42:29.502 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:29 np0005531888 nova_compute[186788]: 2025-11-22 08:42:29.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:29 np0005531888 nova_compute[186788]: 2025-11-22 08:42:29.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:33 np0005531888 nova_compute[186788]: 2025-11-22 08:42:33.282 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:34 np0005531888 nova_compute[186788]: 2025-11-22 08:42:34.506 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:34.714 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:42:34 np0005531888 nova_compute[186788]: 2025-11-22 08:42:34.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:34.716 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:42:34 np0005531888 nova_compute[186788]: 2025-11-22 08:42:34.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.859 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5966747e-45f2-43c0-9339-5011e53635fd', 'name': 'tempest-TestNetworkBasicOps-server-453774490', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ab', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '12f63a6d87a947758ab928c0d625ff06', 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'hostId': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.860 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.864 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5966747e-45f2-43c0-9339-5011e53635fd / tape755646b-37 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.864 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.864 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:36.864 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:36.865 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:36.866 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4aa0f76a-d55c-47ec-a3bd-d507cffa7fae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.860471', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '326397e4-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '6ea6b76880d1aee8be5f7874bf61d8fffcddae0d94acc7df0d595797d0b951dd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.860471', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '3263a5ea-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '2e01d925852e7e28b9be7aba9d3db17e8a313fc1b99df9f301a987165af205d1'}]}, 'timestamp': '2025-11-22 08:42:36.865727', '_unique_id': 'a348bf768b554165873f3eda36766570'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.866 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.867 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.867 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets volume: 721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.868 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cfc63ad-4896-4d01-9ca7-883e064e6637', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 721, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.867843', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '326417f0-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '7160f34bc76239538de34195f8b2ae39413e7a48e3c852fa107fa402f6acc9c7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 
'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.867843', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '32642344-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '44952ae9657e504333b46a317afebec5dd924354bd4c33f786f15422804ef9f3'}]}, 'timestamp': '2025-11-22 08:42:36.868446', '_unique_id': 'd7d9998d6b9d42a590687c2172d3742a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.869 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets volume: 825 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0ef240b-4bbe-486c-af1b-0733918fcbab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 825, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.869827', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '3264648a-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'a6a20e96bc3cffaebcc70e262318cf06f53cbab131c1934316da08786f429f06'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 
'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.869827', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '32646c82-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'aaa7729171c373d97a984de85f55de93cfaac905c0b3b328fef0aa6e136a050c'}]}, 'timestamp': '2025-11-22 08:42:36.870256', '_unique_id': 'fd6602b6020f485e92f341ed39d9b62d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.870 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.871 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.bytes.delta volume: 121942 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.871 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '145044c1-2385-484f-9285-604831d88b61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 121942, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.871417', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '3264a260-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '2f035bbc509306191feeb73843fcc58311022856ccead2dd5456df53d92af5c7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.871417', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '3264adf0-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'e3bb1e28588c312563c580541e7b373fd93cb9946deba7a0f591fbcf68310782'}]}, 'timestamp': '2025-11-22 08:42:36.871961', '_unique_id': '19b4d657e3c74e98a139fcbb173d4810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.872 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:42:36 np0005531888 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.906 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.latency volume: 769303253 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.907 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.latency volume: 56991771 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89fdef6b-e1f3-475b-88d2-fd383691db72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 769303253, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.873869', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '326a0930-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '069aab463b1e03469b0652c185efe8b1858b63116c1899c73be1bed419a75829'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56991771, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': 
None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.873869', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '326a1420-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '6a55e85fdec727b25eac4f1c71d3a680bb7488c7448bb1f09b2cc2d422ae6b64'}]}, 'timestamp': '2025-11-22 08:42:36.907361', '_unique_id': '85d20dbb5fab4877aa5a43db38d6695b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.908 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.923 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.924 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9853f5e-0fbf-47cf-9b80-be4999feee04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.909476', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '326c9d62-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.609036253, 'message_signature': '84347d20741ca8589788daa6728baa737b7ec0d15ea9f6f430d3c351c53612ea'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 
'5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.909476', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '326cab18-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.609036253, 'message_signature': 'dc0012c642ca88affe4e87faed533b382c8e63ef75291e0fbbfde27fef2e820b'}]}, 'timestamp': '2025-11-22 08:42:36.924340', '_unique_id': 'a385ece13786499a9f12f75553c12a67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.925 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.926 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.requests volume: 377 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.926 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53fdc41c-b6db-4ca5-a09d-ffc410ddd5d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 377, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.926416', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '326d0a0e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '310b1112aad324dd7a9a26a283e7a3608078d9ee56ca844f9ba2e7ee2dfc591a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.926416', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '326d129c-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '51f25822d486f06680acc2c46d9f8907ac97031aa1161a2a7588d68d7fc1d33c'}]}, 'timestamp': '2025-11-22 08:42:36.926948', '_unique_id': '2d034e1609b1485980ea5c7b5788d6f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.928 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.bytes volume: 30558720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.928 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '959e20f3-83ea-4916-ad09-98f7e8f7152d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30558720, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.928352', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '326d5298-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '21a64c9336b6a0842f974e38032f7e331d4bf1b4390ca14fa0e06c911e2c4fd9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.928352', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '326d5cc0-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '55dee349c7d9aaeed9832d6d28382c1b1383e77c1701ece495916a7e21aa9062'}]}, 'timestamp': '2025-11-22 08:42:36.928834', '_unique_id': 'eeba25b8f7fe40afb2064f2b5e7929c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.930 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.947 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/cpu volume: 14930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d2cc336-51dd-4109-8323-2cde9a087455', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14930000000, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'timestamp': '2025-11-22T08:42:36.930208', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '32704160-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.64665946, 'message_signature': '9c6600ed221d5fe6d3025edf47a69e4ea7426f73b70686966255a14a4579ef1b'}]}, 'timestamp': '2025-11-22 08:42:36.947878', '_unique_id': 'c39e89178b1a46dc901058db8fc93d01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.950 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.latency volume: 6039354137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.950 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a149471b-ccd9-4519-b848-f7197e654c19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6039354137, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.950150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3270a6dc-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': '2d3167bca19b2014df828466b8628fa363a8afa7d90cf66396b3eb1ffa685903'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 
'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.950150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3270b294-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': 'bda74367f03cf1f85bef6bacb2f89661494896ca349a71e8536d74a65a218e57'}]}, 'timestamp': '2025-11-22 08:42:36.950731', '_unique_id': 'aee1c1c38cb146ec8ff591f7fdee0b4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.952 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.952 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c477f8c-e5ec-4250-ab51-7bdda619c9a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.952678', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '32710abe-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '9cb094277402d688a08a3a67023f1be5ea11dc876d82eb993f2d7648b9b83bd2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.952678', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '32711608-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'f7d4d7989030b57a473435f9f2c854d9a9044a74b0af6696c3a9f81b123f62f2'}]}, 'timestamp': '2025-11-22 08:42:36.953246', '_unique_id': '50ff8775862742b1a6a3c808cdf4910a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 nova_compute[186788]: 2025-11-22 08:42:36.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.954 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.955 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.955 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b257c6b-852c-4cec-b4d3-af6629f46f54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.955078', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '32716888-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.609036253, 'message_signature': '44bbbf384995a016687f3333b1f41b9bb83744ed3f42a4c0f78c842a36a5d0cc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 
'5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.955078', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3271726a-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.609036253, 'message_signature': 'b5f16e59f04b9d8957be3894fc13a366d5f4dc5f4253cc43c80b1a489fc1a58a'}]}, 'timestamp': '2025-11-22 08:42:36.955634', '_unique_id': '1b7d3e73f4044e8f9042fa05ebdc89b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.957 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.requests volume: 1100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.957 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d333d05-45ec-40c3-8455-8f8cde88395e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1100, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.957659', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3271cd78-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': 'e7de311b37b8cce068066b23aa61870ad705feeb7795828ae56b91abb7aaacde'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': 
None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.957659', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3271d75a-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': 'ec4d00f7b8d73b4fea4251e5ae6daf27a299c71d5f91f43ff62e974f826dcfec'}]}, 'timestamp': '2025-11-22 08:42:36.958191', '_unique_id': 'b496aa13edc94332b1782bef18c0bccf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.960 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.bytes volume: 73457664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.960 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e6b3033-731c-4a70-a2cc-bdbf8647a1fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73457664, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.960178', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '32722f2a-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': 'f6ffc9214c81de275525a41563e5e419ff0eadaa7905a48275bc2fef7a1d560b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 
'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.960178', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '327239c0-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.573380175, 'message_signature': 'afb9aa10c924d7092ab946fd092cd6fb4193b8625c520397da8d5797b4f724ed'}]}, 'timestamp': '2025-11-22 08:42:36.960708', '_unique_id': 'aa9b019f71634c7a97c8264022fb1d0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.962 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.962 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96aacf87-e489-4429-bc5b-702df231c34f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.962288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '327281be-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'e5521be1ca6f9e178c120d415a1bca0690b4ed07f3b127c6075a95b0abef7543'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.962288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '32728d1c-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '672399b7944b6b5736ae09dc1014f4d0abaee9b18f7c372c548583e234771bf7'}]}, 'timestamp': '2025-11-22 08:42:36.962847', '_unique_id': '992ca3499640484a9faf0d26756a0aec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.964 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.usage volume: 30277632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.965 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0582a523-e81b-4958-ba2f-bfa001059ae4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30277632, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-vda', 'timestamp': '2025-11-22T08:42:36.964814', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3272e492-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.609036253, 'message_signature': 'aed7ddca5ad8af0b55c4c836da77fbae89dc296b9a0d07a2bce7240bcc4a9464'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd-sda', 'timestamp': '2025-11-22T08:42:36.964814', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3272f0a4-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.609036253, 'message_signature': '29d4a574455c58382a0aff3e8e28ed3ca05e634abf86f93ef1c8584ad10bae91'}]}, 'timestamp': '2025-11-22 08:42:36.965414', '_unique_id': '16d7b2f61ca346328a7512a879ffa7f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.966 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.967 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.967 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.bytes.delta volume: 135376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.968 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30c598e1-322f-4b7b-a63f-f0baac3c150a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 135376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.967685', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '32735684-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '266a5e183c6eb3eac9423e82be88799b7307eb2a04724e51382ab22276b5fd76'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.967685', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '3273612e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'b1f422f4aebe5a0be2bdb9f581cd9b1b082a8330d38f8e69893849b888f98666'}]}, 'timestamp': '2025-11-22 08:42:36.968282', '_unique_id': '238210ee80b84da4af6e0e99c1cda3f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.970 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.970 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.bytes volume: 139765 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.970 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.bytes volume: 4200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cce2d712-d345-4f11-9688-2198cabde796', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 139765, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.970641', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '3273c9e8-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'a6056a5a7c9d281c314e6185c26fb876f458378eea1a8bed0ce1d83583f054a8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4200, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.970641', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '3273d5e6-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'baea70886dce2e5898f8651b27e6d02f6b44b462a89a5089b4681c549bec877a'}]}, 'timestamp': '2025-11-22 08:42:36.971286', '_unique_id': '56fbbdde86cf4e91912f6736072de62e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.973 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.bytes volume: 125360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.973 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.outgoing.bytes volume: 5552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6442bd4-e407-4d0b-a22f-4a1abc6faecb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 125360, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.973305', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '32743310-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '0646d2479f7d09bfc575b64b73be101d1dcd1955a736bbc1d1c9b2eb100c9077'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5552, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.973305', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '327440a8-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'ea004b47096ef755f8ca4a5f8871badb48463baa4f9b6ad9b7a6d68447cbffe6'}]}, 'timestamp': '2025-11-22 08:42:36.974006', '_unique_id': '71a593a6f5c64367ab60938bb401ae14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.974 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/memory.usage volume: 42.66796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2632142a-9a48-44fd-ad9e-f7d87602fd91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.66796875, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'timestamp': '2025-11-22T08:42:36.976173', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'instance-000000ab', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3274a052-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.64665946, 'message_signature': 'b65ba0e29fa668a84b24749b94a8cbfe5ee359a6ea4bcc6e0efe4c0636e8347f'}]}, 'timestamp': '2025-11-22 08:42:36.976449', '_unique_id': '356b92a2b9614791b2e730170d9b3eb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.978 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.978 12 DEBUG ceilometer.compute.pollsters [-] 5966747e-45f2-43c0-9339-5011e53635fd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71c45157-3c28-40cf-b4d2-5c9493b41a4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tapb45d9774-11', 'timestamp': '2025-11-22T08:42:36.978118', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tapb45d9774-11', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7f:10:e5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb45d9774-11'}, 'message_id': '3274edbe-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': 'bac61b3c12eec60e4cdfff5697f5ea02c792f0e6027ec42b5612fd83bddf7a1f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000ab-5966747e-45f2-43c0-9339-5011e53635fd-tape755646b-37', 'timestamp': '2025-11-22T08:42:36.978118', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-453774490', 'name': 'tape755646b-37', 'instance_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:cd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape755646b-37'}, 'message_id': '3274fc28-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7697.559972914, 'message_signature': '12a9fd5c306621582ab1e9243c87ff8036cc3b0dff3767bc2b605e17e7e14399'}]}, 'timestamp': '2025-11-22 08:42:36.978802', '_unique_id': '812ea51ce55b472796ae8282ddd5ce3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:42:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:42:36.979 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:42:36 np0005531888 nova_compute[186788]: 2025-11-22 08:42:36.991 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:36 np0005531888 nova_compute[186788]: 2025-11-22 08:42:36.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:36 np0005531888 nova_compute[186788]: 2025-11-22 08:42:36.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:36 np0005531888 nova_compute[186788]: 2025-11-22 08:42:36.992 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.012 186792 DEBUG oslo_concurrency.lockutils [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "interface-5966747e-45f2-43c0-9339-5011e53635fd-e755646b-370c-417b-bd1d-eb2939ed2c4e" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.013 186792 DEBUG oslo_concurrency.lockutils [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-5966747e-45f2-43c0-9339-5011e53635fd-e755646b-370c-417b-bd1d-eb2939ed2c4e" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.045 186792 DEBUG nova.objects.instance [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'flavor' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.077 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.096 186792 DEBUG nova.virt.libvirt.vif [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.097 186792 DEBUG nova.network.os_vif_util [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.098 186792 DEBUG nova.network.os_vif_util [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.102 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.105 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.107 186792 DEBUG nova.virt.libvirt.driver [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Attempting to detach device tape755646b-37 from instance 5966747e-45f2-43c0-9339-5011e53635fd from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.107 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:c0:cd:dd"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <target dev="tape755646b-37"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </interface>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.137 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.137 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.167 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.172 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface>not found in domain: <domain type='kvm' id='83'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <name>instance-000000ab</name>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <uuid>5966747e-45f2-43c0-9339-5011e53635fd</uuid>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:40:48</nova:creationTime>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:port uuid="e755646b-370c-417b-bd1d-eb2939ed2c4e">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <memory unit='KiB'>131072</memory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <resource>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <partition>/machine</partition>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </resource>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <sysinfo type='smbios'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='serial'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='uuid'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <boot dev='hd'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <smbios mode='sysinfo'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <vmcoreinfo state='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <feature policy='require' name='x2apic'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <feature policy='require' name='vme'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <clock offset='utc'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <timer name='hpet' present='no'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <on_reboot>restart</on_reboot>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <on_crash>destroy</on_crash>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <disk type='file' device='disk'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk' index='2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <backingStore type='file' index='3'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <format type='raw'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <backingStore/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      </backingStore>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='vda' bus='virtio'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='virtio-disk0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <disk type='file' device='cdrom'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config' index='1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <backingStore/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='sda' bus='sata'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <readonly/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='sata0-0-0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pcie.0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='1' port='0x10'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='2' port='0x11'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='3' port='0x12'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='4' port='0x13'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='5' port='0x14'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='6' port='0x15'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='7' port='0x16'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='8' port='0x17'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.8'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='9' port='0x18'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.9'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='10' port='0x19'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.10'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='11' port='0x1a'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.11'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='12' port='0x1b'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.12'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='13' port='0x1c'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.13'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='14' port='0x1d'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.14'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='15' port='0x1e'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.15'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='16' port='0x1f'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.16'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='17' port='0x20'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.17'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='18' port='0x21'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.18'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='19' port='0x22'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.19'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='20' port='0x23'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.20'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='21' port='0x24'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.21'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='22' port='0x25'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.22'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='23' port='0x26'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.23'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='24' port='0x27'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.24'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='25' port='0x28'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.25'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-pci-bridge'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.26'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='usb'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='sata' index='0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='ide'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:7f:10:e5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='tapb45d9774-11'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='net0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:c0:cd:dd'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='tape755646b-37'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='net1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <serial type='pty'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target type='isa-serial' port='0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <model name='isa-serial'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      </target>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target type='serial' port='0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </console>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <input type='tablet' bus='usb'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='input0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <input type='mouse' bus='ps2'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='input1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <input type='keyboard' bus='ps2'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='input2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <listen type='address' address='::0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <audio id='1' type='none'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='video0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <watchdog model='itco' action='reset'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='watchdog0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </watchdog>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <memballoon model='virtio'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <stats period='10'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='balloon0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <rng model='virtio'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='rng0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <label>system_u:system_r:svirt_t:s0:c355,c962</label>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c355,c962</imagelabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <label>+107:+107</label>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.173 186792 INFO nova.virt.libvirt.driver [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully detached device tape755646b-37 from instance 5966747e-45f2-43c0-9339-5011e53635fd from the persistent domain config.
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.173 186792 DEBUG nova.virt.libvirt.driver [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] (1/8): Attempting to detach device tape755646b-37 with device alias net1 from instance 5966747e-45f2-43c0-9339-5011e53635fd from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.174 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <mac address="fa:16:3e:c0:cd:dd"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <model type="virtio"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <mtu size="1442"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <target dev="tape755646b-37"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </interface>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.191 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:42:37 np0005531888 kernel: tape755646b-37 (unregistering): left promiscuous mode
Nov 22 03:42:37 np0005531888 NetworkManager[55166]: <info>  [1763800957.2717] device (tape755646b-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:42:37 np0005531888 virtqemud[186358]: An error occurred, but the cause is unknown
Nov 22 03:42:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:37Z|00709|binding|INFO|Releasing lport e755646b-370c-417b-bd1d-eb2939ed2c4e from this chassis (sb_readonly=0)
Nov 22 03:42:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:37Z|00710|binding|INFO|Setting lport e755646b-370c-417b-bd1d-eb2939ed2c4e down in Southbound
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.282 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:42:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:37Z|00711|binding|INFO|Removing iface tape755646b-37 ovn-installed in OVS
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.289 186792 DEBUG nova.virt.libvirt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Received event <DeviceRemovedEvent: 1763800957.2888672, 5966747e-45f2-43c0-9339-5011e53635fd => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.292 186792 DEBUG nova.virt.libvirt.driver [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Start waiting for the detach event from libvirt for device tape755646b-37 with device alias net1 for instance 5966747e-45f2-43c0-9339-5011e53635fd _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.293 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.297 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface>not found in domain: <domain type='kvm' id='83'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <name>instance-000000ab</name>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <uuid>5966747e-45f2-43c0-9339-5011e53635fd</uuid>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:40:48</nova:creationTime>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:port uuid="e755646b-370c-417b-bd1d-eb2939ed2c4e">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <memory unit='KiB'>131072</memory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <resource>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <partition>/machine</partition>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </resource>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <sysinfo type='smbios'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='serial'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='uuid'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <boot dev='hd'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <smbios mode='sysinfo'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <vmcoreinfo state='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <feature policy='require' name='x2apic'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <feature policy='require' name='vme'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <clock offset='utc'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <timer name='hpet' present='no'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <on_reboot>restart</on_reboot>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <on_crash>destroy</on_crash>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <disk type='file' device='disk'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk' index='2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <backingStore type='file' index='3'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <format type='raw'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <backingStore/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      </backingStore>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='vda' bus='virtio'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='virtio-disk0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <disk type='file' device='cdrom'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config' index='1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <backingStore/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='sda' bus='sata'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <readonly/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='sata0-0-0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pcie.0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='1' port='0x10'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='2' port='0x11'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='3' port='0x12'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='4' port='0x13'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='5' port='0x14'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='6' port='0x15'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='7' port='0x16'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='8' port='0x17'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.8'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='9' port='0x18'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.9'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='10' port='0x19'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.10'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='11' port='0x1a'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.11'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='12' port='0x1b'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.12'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='13' port='0x1c'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.13'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='14' port='0x1d'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.14'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='15' port='0x1e'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.15'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='16' port='0x1f'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.16'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='17' port='0x20'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.17'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='18' port='0x21'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.18'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='19' port='0x22'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.19'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='20' port='0x23'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.20'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='21' port='0x24'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.21'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='22' port='0x25'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.22'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='23' port='0x26'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.23'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='24' port='0x27'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.24'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target chassis='25' port='0x28'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.25'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model name='pcie-pci-bridge'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='pci.26'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='usb'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <controller type='sata' index='0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='ide'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:7f:10:e5'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target dev='tapb45d9774-11'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='net0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <serial type='pty'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target type='isa-serial' port='0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:        <model name='isa-serial'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      </target>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <target type='serial' port='0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </console>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <input type='tablet' bus='usb'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='input0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <input type='mouse' bus='ps2'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='input1'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <input type='keyboard' bus='ps2'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='input2'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <listen type='address' address='::0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <audio id='1' type='none'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='video0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <watchdog model='itco' action='reset'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='watchdog0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </watchdog>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <memballoon model='virtio'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <stats period='10'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='balloon0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <rng model='virtio'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <alias name='rng0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <label>system_u:system_r:svirt_t:s0:c355,c962</label>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c355,c962</imagelabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <label>+107:+107</label>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.298 186792 INFO nova.virt.libvirt.driver [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully detached device tape755646b-37 from instance 5966747e-45f2-43c0-9339-5011e53635fd from the live domain config.#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.299 186792 DEBUG nova.virt.libvirt.vif [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.299 186792 DEBUG nova.network.os_vif_util [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.300 186792 DEBUG nova.network.os_vif_util [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.300 186792 DEBUG os_vif [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.302 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape755646b-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.303 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.304 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.305 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.308 186792 INFO os_vif [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37')#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.309 186792 DEBUG nova.virt.libvirt.guest [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:42:37</nova:creationTime>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:42:37 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:37 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:42:37 np0005531888 nova_compute[186788]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.370 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:cd:dd 10.100.0.21', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8d3a649-095f-4b94-af55-194302b82348', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8bf1006-f131-4804-9af5-1455421df1f6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=e755646b-370c-417b-bd1d-eb2939ed2c4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.372 104023 INFO neutron.agent.ovn.metadata.agent [-] Port e755646b-370c-417b-bd1d-eb2939ed2c4e in datapath b8d3a649-095f-4b94-af55-194302b82348 unbound from our chassis#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.373 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8d3a649-095f-4b94-af55-194302b82348, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.375 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f580876a-5aaf-48d9-865f-039532cb9695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.375 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 namespace which is not needed anymore#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.414 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.415 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5556MB free_disk=73.23736572265625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.415 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.416 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.563 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 5966747e-45f2-43c0-9339-5011e53635fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.563 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.564 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:42:37 np0005531888 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[248307]: [NOTICE]   (248311) : haproxy version is 2.8.14-c23fe91
Nov 22 03:42:37 np0005531888 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[248307]: [NOTICE]   (248311) : path to executable is /usr/sbin/haproxy
Nov 22 03:42:37 np0005531888 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[248307]: [ALERT]    (248311) : Current worker (248313) exited with code 143 (Terminated)
Nov 22 03:42:37 np0005531888 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[248307]: [WARNING]  (248311) : All workers exited. Exiting... (0)
Nov 22 03:42:37 np0005531888 systemd[1]: libpod-db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c.scope: Deactivated successfully.
Nov 22 03:42:37 np0005531888 podman[248877]: 2025-11-22 08:42:37.578387086 +0000 UTC m=+0.120705854 container died db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.656 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.675 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.677 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.677 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:37 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c-userdata-shm.mount: Deactivated successfully.
Nov 22 03:42:37 np0005531888 systemd[1]: var-lib-containers-storage-overlay-35aac08efb62a4230e3b17a54be3baaedbf3ba68fc952d41ba8fc08674084c44-merged.mount: Deactivated successfully.
Nov 22 03:42:37 np0005531888 podman[248877]: 2025-11-22 08:42:37.729716234 +0000 UTC m=+0.272035002 container cleanup db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:42:37 np0005531888 podman[248907]: 2025-11-22 08:42:37.791722941 +0000 UTC m=+0.041794931 container remove db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:42:37 np0005531888 systemd[1]: libpod-conmon-db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c.scope: Deactivated successfully.
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.798 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1630bc12-687e-42c2-bb64-34ae2acfd9c0]: (4, ('Sat Nov 22 08:42:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 (db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c)\ndb52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c\nSat Nov 22 08:42:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 (db52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c)\ndb52bd02d46b6469fec96d2422180163c7487de5a93f80c02361932bc43dd49c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.800 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9196c2-b28c-4684-abb8-961a72e1d1d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.801 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8d3a649-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:37 np0005531888 kernel: tapb8d3a649-00: left promiscuous mode
Nov 22 03:42:37 np0005531888 nova_compute[186788]: 2025-11-22 08:42:37.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.818 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e73af41d-2d4a-49f4-8751-2cf8f6fe9f27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.833 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cfc9c9-f288-4949-948f-76873b8d55f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.836 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fef86834-2484-4f20-9f8f-e3d33b20caec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.856 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8a94b0-93e5-4b05-aee7-7b4a685d79ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 758901, 'reachable_time': 18445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248922, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:37 np0005531888 systemd[1]: run-netns-ovnmeta\x2db8d3a649\x2d095f\x2d4b94\x2daf55\x2d194302b82348.mount: Deactivated successfully.
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.861 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:42:37 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:37.861 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[165a33a4-457e-4572-8c5b-64168cd4ee03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.200 186792 DEBUG nova.compute.manager [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-unplugged-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.201 186792 DEBUG oslo_concurrency.lockutils [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.202 186792 DEBUG oslo_concurrency.lockutils [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.203 186792 DEBUG oslo_concurrency.lockutils [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.203 186792 DEBUG nova.compute.manager [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-unplugged-e755646b-370c-417b-bd1d-eb2939ed2c4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.203 186792 WARNING nova.compute.manager [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received unexpected event network-vif-unplugged-e755646b-370c-417b-bd1d-eb2939ed2c4e for instance with vm_state active and task_state None.#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.203 186792 DEBUG nova.compute.manager [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.204 186792 DEBUG oslo_concurrency.lockutils [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.204 186792 DEBUG oslo_concurrency.lockutils [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.204 186792 DEBUG oslo_concurrency.lockutils [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.204 186792 DEBUG nova.compute.manager [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.204 186792 WARNING nova.compute.manager [req-a15a6aa0-5df8-48dc-988f-6b058b56ce6c req-5e11ebc5-e307-44af-8851-6d983de9b0f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received unexpected event network-vif-plugged-e755646b-370c-417b-bd1d-eb2939ed2c4e for instance with vm_state active and task_state None.#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.510 186792 DEBUG oslo_concurrency.lockutils [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.510 186792 DEBUG oslo_concurrency.lockutils [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.510 186792 DEBUG nova.network.neutron [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.642 186792 DEBUG nova.compute.manager [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-deleted-e755646b-370c-417b-bd1d-eb2939ed2c4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.643 186792 INFO nova.compute.manager [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Neutron deleted interface e755646b-370c-417b-bd1d-eb2939ed2c4e; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.643 186792 DEBUG nova.network.neutron [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.677 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.703 186792 DEBUG nova.objects.instance [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'system_metadata' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.755 186792 DEBUG nova.objects.instance [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'flavor' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.774 186792 DEBUG nova.virt.libvirt.vif [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.774 186792 DEBUG nova.network.os_vif_util [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.775 186792 DEBUG nova.network.os_vif_util [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.781 186792 DEBUG nova.virt.libvirt.guest [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.784 186792 DEBUG nova.virt.libvirt.guest [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface>not found in domain: <domain type='kvm' id='83'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <name>instance-000000ab</name>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <uuid>5966747e-45f2-43c0-9339-5011e53635fd</uuid>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:42:37</nova:creationTime>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <memory unit='KiB'>131072</memory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <resource>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <partition>/machine</partition>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </resource>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <sysinfo type='smbios'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='serial'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='uuid'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <boot dev='hd'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <smbios mode='sysinfo'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <vmcoreinfo state='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <feature policy='require' name='x2apic'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <feature policy='require' name='vme'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <clock offset='utc'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <timer name='hpet' present='no'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <on_reboot>restart</on_reboot>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <on_crash>destroy</on_crash>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <disk type='file' device='disk'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk' index='2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <backingStore type='file' index='3'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <format type='raw'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <backingStore/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      </backingStore>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target dev='vda' bus='virtio'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='virtio-disk0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <disk type='file' device='cdrom'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config' index='1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <backingStore/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target dev='sda' bus='sata'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <readonly/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='sata0-0-0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pcie.0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='1' port='0x10'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='2' port='0x11'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='3' port='0x12'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='4' port='0x13'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='5' port='0x14'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='6' port='0x15'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='7' port='0x16'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='8' port='0x17'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.8'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='9' port='0x18'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.9'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='10' port='0x19'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.10'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='11' port='0x1a'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.11'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='12' port='0x1b'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.12'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='13' port='0x1c'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.13'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='14' port='0x1d'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.14'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='15' port='0x1e'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.15'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='16' port='0x1f'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.16'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='17' port='0x20'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.17'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='18' port='0x21'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.18'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='19' port='0x22'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.19'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='20' port='0x23'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.20'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='21' port='0x24'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.21'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='22' port='0x25'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.22'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='23' port='0x26'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.23'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='24' port='0x27'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.24'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='25' port='0x28'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.25'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-pci-bridge'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.26'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='usb'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='sata' index='0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='ide'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:7f:10:e5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target dev='tapb45d9774-11'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='net0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <serial type='pty'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target type='isa-serial' port='0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <model name='isa-serial'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      </target>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target type='serial' port='0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </console>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <input type='tablet' bus='usb'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='input0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <input type='mouse' bus='ps2'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='input1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <input type='keyboard' bus='ps2'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='input2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <listen type='address' address='::0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <audio id='1' type='none'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='video0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <watchdog model='itco' action='reset'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='watchdog0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </watchdog>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <memballoon model='virtio'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <stats period='10'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='balloon0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <rng model='virtio'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='rng0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <label>system_u:system_r:svirt_t:s0:c355,c962</label>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c355,c962</imagelabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <label>+107:+107</label>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.785 186792 DEBUG nova.virt.libvirt.guest [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.789 186792 DEBUG nova.virt.libvirt.guest [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:cd:dd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape755646b-37"/></interface>not found in domain: <domain type='kvm' id='83'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <name>instance-000000ab</name>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <uuid>5966747e-45f2-43c0-9339-5011e53635fd</uuid>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:42:37</nova:creationTime>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <memory unit='KiB'>131072</memory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <resource>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <partition>/machine</partition>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </resource>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <sysinfo type='smbios'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='serial'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='uuid'>5966747e-45f2-43c0-9339-5011e53635fd</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <boot dev='hd'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <smbios mode='sysinfo'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <vmcoreinfo state='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <feature policy='require' name='x2apic'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <feature policy='require' name='vme'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <clock offset='utc'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <timer name='hpet' present='no'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <on_reboot>restart</on_reboot>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <on_crash>destroy</on_crash>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <disk type='file' device='disk'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk' index='2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <backingStore type='file' index='3'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <format type='raw'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <backingStore/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      </backingStore>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target dev='vda' bus='virtio'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='virtio-disk0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <disk type='file' device='cdrom'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/disk.config' index='1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <backingStore/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target dev='sda' bus='sata'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <readonly/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='sata0-0-0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pcie.0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='1' port='0x10'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='2' port='0x11'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='3' port='0x12'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='4' port='0x13'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='5' port='0x14'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='6' port='0x15'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='7' port='0x16'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='8' port='0x17'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.8'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='9' port='0x18'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.9'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='10' port='0x19'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.10'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='11' port='0x1a'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.11'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='12' port='0x1b'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.12'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='13' port='0x1c'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.13'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='14' port='0x1d'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.14'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='15' port='0x1e'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.15'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='16' port='0x1f'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.16'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='17' port='0x20'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.17'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='18' port='0x21'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.18'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='19' port='0x22'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.19'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='20' port='0x23'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.20'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='21' port='0x24'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.21'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='22' port='0x25'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.22'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='23' port='0x26'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.23'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='24' port='0x27'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.24'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-root-port'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target chassis='25' port='0x28'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.25'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model name='pcie-pci-bridge'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='pci.26'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='usb'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <controller type='sata' index='0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='ide'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </controller>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <interface type='ethernet'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <mac address='fa:16:3e:7f:10:e5'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target dev='tapb45d9774-11'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model type='virtio'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <mtu size='1442'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='net0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <serial type='pty'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target type='isa-serial' port='0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:        <model name='isa-serial'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      </target>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <source path='/dev/pts/0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <log file='/var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd/console.log' append='off'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <target type='serial' port='0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='serial0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </console>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <input type='tablet' bus='usb'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='input0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <input type='mouse' bus='ps2'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='input1'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <input type='keyboard' bus='ps2'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='input2'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </input>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <listen type='address' address='::0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </graphics>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <audio id='1' type='none'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='video0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <watchdog model='itco' action='reset'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='watchdog0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </watchdog>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <memballoon model='virtio'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <stats period='10'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='balloon0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <rng model='virtio'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <alias name='rng0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <label>system_u:system_r:svirt_t:s0:c355,c962</label>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c355,c962</imagelabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <label>+107:+107</label>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </seclabel>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.789 186792 WARNING nova.virt.libvirt.driver [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Detaching interface fa:16:3e:c0:cd:dd failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tape755646b-37' not found.
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.789 186792 DEBUG nova.virt.libvirt.vif [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.790 186792 DEBUG nova.network.os_vif_util [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "address": "fa:16:3e:c0:cd:dd", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape755646b-37", "ovs_interfaceid": "e755646b-370c-417b-bd1d-eb2939ed2c4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.790 186792 DEBUG nova.network.os_vif_util [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.791 186792 DEBUG os_vif [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.792 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape755646b-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.792 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.794 186792 INFO os_vif [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:cd:dd,bridge_name='br-int',has_traffic_filtering=True,id=e755646b-370c-417b-bd1d-eb2939ed2c4e,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape755646b-37')
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.795 186792 DEBUG nova.virt.libvirt.guest [req-d532a11f-7056-4e42-b31c-ba60debcad00 req-9d1ee78e-d682-4435-acbb-abcdc4716599 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:name>tempest-TestNetworkBasicOps-server-453774490</nova:name>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:creationTime>2025-11-22 08:42:38</nova:creationTime>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:flavor name="m1.nano">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:memory>128</nova:memory>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:disk>1</nova:disk>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:swap>0</nova:swap>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:flavor>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:owner>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:owner>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  <nova:ports>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    <nova:port uuid="b45d9774-1141-4b22-81f2-234b0c06226c">
Nov 22 03:42:38 np0005531888 nova_compute[186788]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:    </nova:port>
Nov 22 03:42:38 np0005531888 nova_compute[186788]:  </nova:ports>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: </nova:instance>
Nov 22 03:42:38 np0005531888 nova_compute[186788]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:42:38 np0005531888 nova_compute[186788]: 2025-11-22 08:42:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:39 np0005531888 nova_compute[186788]: 2025-11-22 08:42:39.508 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:39 np0005531888 podman[248923]: 2025-11-22 08:42:39.676263305 +0000 UTC m=+0.048090086 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:42:39 np0005531888 podman[248924]: 2025-11-22 08:42:39.708624932 +0000 UTC m=+0.077038579 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 03:42:40 np0005531888 nova_compute[186788]: 2025-11-22 08:42:40.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:40 np0005531888 nova_compute[186788]: 2025-11-22 08:42:40.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:42:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:41Z|00712|binding|INFO|Releasing lport c5e5de30-961a-4548-9448-54ac752e5c80 from this chassis (sb_readonly=0)
Nov 22 03:42:41 np0005531888 nova_compute[186788]: 2025-11-22 08:42:41.586 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:42 np0005531888 nova_compute[186788]: 2025-11-22 08:42:42.306 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:42 np0005531888 nova_compute[186788]: 2025-11-22 08:42:42.517 186792 INFO nova.network.neutron [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Port e755646b-370c-417b-bd1d-eb2939ed2c4e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 22 03:42:42 np0005531888 nova_compute[186788]: 2025-11-22 08:42:42.518 186792 DEBUG nova.network.neutron [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:42:42 np0005531888 nova_compute[186788]: 2025-11-22 08:42:42.531 186792 DEBUG oslo_concurrency.lockutils [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:42:42 np0005531888 nova_compute[186788]: 2025-11-22 08:42:42.594 186792 DEBUG oslo_concurrency.lockutils [None req-5d34fa85-ce1e-4326-9ed0-93a070977710 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-5966747e-45f2-43c0-9339-5011e53635fd-e755646b-370c-417b-bd1d-eb2939ed2c4e" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:43 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:43.718 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:42:44 np0005531888 nova_compute[186788]: 2025-11-22 08:42:44.510 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.748 186792 DEBUG nova.compute.manager [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-changed-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.749 186792 DEBUG nova.compute.manager [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing instance network info cache due to event network-changed-b45d9774-1141-4b22-81f2-234b0c06226c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.749 186792 DEBUG oslo_concurrency.lockutils [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.749 186792 DEBUG oslo_concurrency.lockutils [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.749 186792 DEBUG nova.network.neutron [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Refreshing network info cache for port b45d9774-1141-4b22-81f2-234b0c06226c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.793 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.794 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.794 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.794 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.794 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.802 186792 INFO nova.compute.manager [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Terminating instance#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.810 186792 DEBUG nova.compute.manager [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:42:45 np0005531888 kernel: tapb45d9774-11 (unregistering): left promiscuous mode
Nov 22 03:42:45 np0005531888 NetworkManager[55166]: <info>  [1763800965.8299] device (tapb45d9774-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:42:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:45Z|00713|binding|INFO|Releasing lport b45d9774-1141-4b22-81f2-234b0c06226c from this chassis (sb_readonly=0)
Nov 22 03:42:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:45Z|00714|binding|INFO|Setting lport b45d9774-1141-4b22-81f2-234b0c06226c down in Southbound
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.834 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:45 np0005531888 ovn_controller[95067]: 2025-11-22T08:42:45Z|00715|binding|INFO|Removing iface tapb45d9774-11 ovn-installed in OVS
Nov 22 03:42:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:45.840 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:10:e5 10.100.0.11'], port_security=['fa:16:3e:7f:10:e5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5966747e-45f2-43c0-9339-5011e53635fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33840711-4341-4e1a-a9eb-fdace57bd254', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7385b9c4-7ae8-4013-983d-fc34cdf06060', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa1f2ac2-785f-446c-9f84-7df67f45fc3e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=b45d9774-1141-4b22-81f2-234b0c06226c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:42:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:45.842 104023 INFO neutron.agent.ovn.metadata.agent [-] Port b45d9774-1141-4b22-81f2-234b0c06226c in datapath 33840711-4341-4e1a-a9eb-fdace57bd254 unbound from our chassis#033[00m
Nov 22 03:42:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:45.845 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33840711-4341-4e1a-a9eb-fdace57bd254, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:42:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:45.848 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d792a61a-d8cd-47e4-b347-0c0ebd446ae1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:45 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:45.849 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254 namespace which is not needed anymore#033[00m
Nov 22 03:42:45 np0005531888 nova_compute[186788]: 2025-11-22 08:42:45.853 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:45 np0005531888 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Nov 22 03:42:45 np0005531888 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ab.scope: Consumed 22.249s CPU time.
Nov 22 03:42:45 np0005531888 systemd-machined[153106]: Machine qemu-83-instance-000000ab terminated.
Nov 22 03:42:45 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [NOTICE]   (248007) : haproxy version is 2.8.14-c23fe91
Nov 22 03:42:45 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [NOTICE]   (248007) : path to executable is /usr/sbin/haproxy
Nov 22 03:42:45 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [WARNING]  (248007) : Exiting Master process...
Nov 22 03:42:45 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [ALERT]    (248007) : Current worker (248009) exited with code 143 (Terminated)
Nov 22 03:42:45 np0005531888 neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254[248003]: [WARNING]  (248007) : All workers exited. Exiting... (0)
Nov 22 03:42:45 np0005531888 systemd[1]: libpod-327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360.scope: Deactivated successfully.
Nov 22 03:42:45 np0005531888 podman[248990]: 2025-11-22 08:42:45.994837796 +0000 UTC m=+0.049296045 container died 327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:42:46 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360-userdata-shm.mount: Deactivated successfully.
Nov 22 03:42:46 np0005531888 systemd[1]: var-lib-containers-storage-overlay-a871537357e5ce2eac9b5d9b8e116700cff553292e3495542fc770a4c641d984-merged.mount: Deactivated successfully.
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.036 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:46 np0005531888 podman[248990]: 2025-11-22 08:42:46.03721863 +0000 UTC m=+0.091676879 container cleanup 327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:46 np0005531888 systemd[1]: libpod-conmon-327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360.scope: Deactivated successfully.
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.073 186792 INFO nova.virt.libvirt.driver [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Instance destroyed successfully.#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.073 186792 DEBUG nova.objects.instance [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 5966747e-45f2-43c0-9339-5011e53635fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.091 186792 DEBUG nova.virt.libvirt.vif [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-453774490',display_name='tempest-TestNetworkBasicOps-server-453774490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-453774490',id=171,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJGtAKY4za00FJDc3MCnmEscTWf4JA85N5hFAd8mbJKu02kwfXWDkSw7qITv1GHnecxdmlFWbaKG1JAHwTHrLfxdloVdvXSACyxrRbx90GOLYympRetlcNkNcE/Zokg45w==',key_name='tempest-TestNetworkBasicOps-868749241',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:40:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-mulqkg3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:40:11Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=5966747e-45f2-43c0-9339-5011e53635fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.091 186792 DEBUG nova.network.os_vif_util [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.092 186792 DEBUG nova.network.os_vif_util [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.093 186792 DEBUG os_vif [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.094 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.094 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb45d9774-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.098 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.101 186792 INFO os_vif [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:10:e5,bridge_name='br-int',has_traffic_filtering=True,id=b45d9774-1141-4b22-81f2-234b0c06226c,network=Network(33840711-4341-4e1a-a9eb-fdace57bd254),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb45d9774-11')#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.102 186792 INFO nova.virt.libvirt.driver [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Deleting instance files /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd_del#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.102 186792 INFO nova.virt.libvirt.driver [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Deletion of /var/lib/nova/instances/5966747e-45f2-43c0-9339-5011e53635fd_del complete#033[00m
Nov 22 03:42:46 np0005531888 podman[249030]: 2025-11-22 08:42:46.121699821 +0000 UTC m=+0.057984400 container remove 327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.127 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6fab1d50-7273-4014-9926-42eec1c33e65]: (4, ('Sat Nov 22 08:42:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254 (327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360)\n327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360\nSat Nov 22 08:42:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254 (327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360)\n327d324340dea051aa6c94cf6dacf289577c3e978e3584a32a0a8cdec536e360\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.129 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[65d00115-ae41-49bf-8943-8f54b6d2fb33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.130 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33840711-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:42:46 np0005531888 kernel: tap33840711-40: left promiscuous mode
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.138 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e17c9768-07e3-4d03-9e63-4255cd94250f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.159 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[39da8b4f-caa5-464a-9902-41dbc24efbfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.161 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[133f520e-86a4-4b98-ac88-e76ac715909a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.179 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3d07fd1a-9221-42dd-bd92-a6ba541d161d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755134, 'reachable_time': 23063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249049, 'error': None, 'target': 'ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 systemd[1]: run-netns-ovnmeta\x2d33840711\x2d4341\x2d4e1a\x2da9eb\x2dfdace57bd254.mount: Deactivated successfully.
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.183 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-33840711-4341-4e1a-a9eb-fdace57bd254 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:42:46 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:42:46.183 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[57f83373-a985-4d28-9082-ef95905505ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.205 186792 INFO nova.compute.manager [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.206 186792 DEBUG oslo.service.loopingcall [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.207 186792 DEBUG nova.compute.manager [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:42:46 np0005531888 nova_compute[186788]: 2025-11-22 08:42:46.207 186792 DEBUG nova.network.neutron [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.868 186792 DEBUG nova.compute.manager [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-unplugged-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.868 186792 DEBUG oslo_concurrency.lockutils [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.868 186792 DEBUG oslo_concurrency.lockutils [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.869 186792 DEBUG oslo_concurrency.lockutils [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.869 186792 DEBUG nova.compute.manager [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-unplugged-b45d9774-1141-4b22-81f2-234b0c06226c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.869 186792 DEBUG nova.compute.manager [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-unplugged-b45d9774-1141-4b22-81f2-234b0c06226c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.869 186792 DEBUG nova.compute.manager [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.869 186792 DEBUG oslo_concurrency.lockutils [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5966747e-45f2-43c0-9339-5011e53635fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.869 186792 DEBUG oslo_concurrency.lockutils [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.870 186792 DEBUG oslo_concurrency.lockutils [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.870 186792 DEBUG nova.compute.manager [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] No waiting events found dispatching network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:42:47 np0005531888 nova_compute[186788]: 2025-11-22 08:42:47.870 186792 WARNING nova.compute.manager [req-987a0840-ecd5-4f06-bad4-1aa0ad9e2fe6 req-8d3bac95-697d-4658-be58-ddaf2ea9da03 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received unexpected event network-vif-plugged-b45d9774-1141-4b22-81f2-234b0c06226c for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.450 186792 DEBUG nova.network.neutron [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.475 186792 INFO nova.compute.manager [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Took 2.27 seconds to deallocate network for instance.#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.485 186792 DEBUG nova.network.neutron [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updated VIF entry in instance network info cache for port b45d9774-1141-4b22-81f2-234b0c06226c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.485 186792 DEBUG nova.network.neutron [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Updating instance_info_cache with network_info: [{"id": "b45d9774-1141-4b22-81f2-234b0c06226c", "address": "fa:16:3e:7f:10:e5", "network": {"id": "33840711-4341-4e1a-a9eb-fdace57bd254", "bridge": "br-int", "label": "tempest-network-smoke--2021896611", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb45d9774-11", "ovs_interfaceid": "b45d9774-1141-4b22-81f2-234b0c06226c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.509 186792 DEBUG oslo_concurrency.lockutils [req-f982f2df-182b-412d-9246-9a08404ef3eb req-2d4e2820-9ba6-4f7d-9bdf-e31ba9cbb37b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5966747e-45f2-43c0-9339-5011e53635fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.549 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.550 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.629 186792 DEBUG nova.compute.provider_tree [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.648 186792 DEBUG nova.scheduler.client.report [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.677 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.784 186792 INFO nova.scheduler.client.report [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 5966747e-45f2-43c0-9339-5011e53635fd#033[00m
Nov 22 03:42:48 np0005531888 nova_compute[186788]: 2025-11-22 08:42:48.945 186792 DEBUG oslo_concurrency.lockutils [None req-5ee5cfb7-49dd-4c04-b71f-eb881968e29e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "5966747e-45f2-43c0-9339-5011e53635fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:49 np0005531888 nova_compute[186788]: 2025-11-22 08:42:49.512 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:49 np0005531888 nova_compute[186788]: 2025-11-22 08:42:49.966 186792 DEBUG nova.compute.manager [req-16903338-3a79-4a67-8ef3-a6c1ce8ffaf1 req-6ad9a3a6-0463-448b-afa4-dc0c60c89258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Received event network-vif-deleted-b45d9774-1141-4b22-81f2-234b0c06226c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:42:49 np0005531888 nova_compute[186788]: 2025-11-22 08:42:49.967 186792 INFO nova.compute.manager [req-16903338-3a79-4a67-8ef3-a6c1ce8ffaf1 req-6ad9a3a6-0463-448b-afa4-dc0c60c89258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Neutron deleted interface b45d9774-1141-4b22-81f2-234b0c06226c; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:42:49 np0005531888 nova_compute[186788]: 2025-11-22 08:42:49.967 186792 DEBUG nova.network.neutron [req-16903338-3a79-4a67-8ef3-a6c1ce8ffaf1 req-6ad9a3a6-0463-448b-afa4-dc0c60c89258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 22 03:42:49 np0005531888 nova_compute[186788]: 2025-11-22 08:42:49.970 186792 DEBUG nova.compute.manager [req-16903338-3a79-4a67-8ef3-a6c1ce8ffaf1 req-6ad9a3a6-0463-448b-afa4-dc0c60c89258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Detach interface failed, port_id=b45d9774-1141-4b22-81f2-234b0c06226c, reason: Instance 5966747e-45f2-43c0-9339-5011e53635fd could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:42:51 np0005531888 nova_compute[186788]: 2025-11-22 08:42:51.097 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:51 np0005531888 podman[249051]: 2025-11-22 08:42:51.699553344 +0000 UTC m=+0.058950503 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:42:51 np0005531888 podman[249050]: 2025-11-22 08:42:51.726647992 +0000 UTC m=+0.092605992 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 22 03:42:54 np0005531888 nova_compute[186788]: 2025-11-22 08:42:54.514 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:55 np0005531888 nova_compute[186788]: 2025-11-22 08:42:55.032 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:55 np0005531888 nova_compute[186788]: 2025-11-22 08:42:55.106 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:55 np0005531888 podman[249095]: 2025-11-22 08:42:55.714408225 +0000 UTC m=+0.071183444 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 03:42:55 np0005531888 podman[249094]: 2025-11-22 08:42:55.715647726 +0000 UTC m=+0.074671431 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Nov 22 03:42:55 np0005531888 podman[249096]: 2025-11-22 08:42:55.752438772 +0000 UTC m=+0.106530285 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:42:56 np0005531888 nova_compute[186788]: 2025-11-22 08:42:56.098 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:59 np0005531888 nova_compute[186788]: 2025-11-22 08:42:59.515 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:01 np0005531888 nova_compute[186788]: 2025-11-22 08:43:01.073 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800966.0717332, 5966747e-45f2-43c0-9339-5011e53635fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:01 np0005531888 nova_compute[186788]: 2025-11-22 08:43:01.074 186792 INFO nova.compute.manager [-] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:43:01 np0005531888 nova_compute[186788]: 2025-11-22 08:43:01.096 186792 DEBUG nova.compute.manager [None req-5dd05eb5-56cf-46d2-bebd-68730a47d44d - - - - - -] [instance: 5966747e-45f2-43c0-9339-5011e53635fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:01 np0005531888 nova_compute[186788]: 2025-11-22 08:43:01.101 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:03 np0005531888 nova_compute[186788]: 2025-11-22 08:43:03.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:04 np0005531888 nova_compute[186788]: 2025-11-22 08:43:04.518 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:06 np0005531888 nova_compute[186788]: 2025-11-22 08:43:06.103 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:09 np0005531888 nova_compute[186788]: 2025-11-22 08:43:09.520 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:10 np0005531888 podman[249160]: 2025-11-22 08:43:10.67178062 +0000 UTC m=+0.046183289 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:43:10 np0005531888 podman[249159]: 2025-11-22 08:43:10.697020611 +0000 UTC m=+0.073821650 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:43:11 np0005531888 nova_compute[186788]: 2025-11-22 08:43:11.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:14 np0005531888 nova_compute[186788]: 2025-11-22 08:43:14.522 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:16 np0005531888 nova_compute[186788]: 2025-11-22 08:43:16.108 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:19 np0005531888 nova_compute[186788]: 2025-11-22 08:43:19.525 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:21 np0005531888 nova_compute[186788]: 2025-11-22 08:43:21.111 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:22 np0005531888 podman[249200]: 2025-11-22 08:43:22.697414014 +0000 UTC m=+0.063965206 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:43:22 np0005531888 podman[249199]: 2025-11-22 08:43:22.705699118 +0000 UTC m=+0.080383710 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:43:22 np0005531888 nova_compute[186788]: 2025-11-22 08:43:22.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:23 np0005531888 nova_compute[186788]: 2025-11-22 08:43:23.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:23 np0005531888 nova_compute[186788]: 2025-11-22 08:43:23.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:43:23 np0005531888 nova_compute[186788]: 2025-11-22 08:43:23.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:43:24 np0005531888 nova_compute[186788]: 2025-11-22 08:43:23.999 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:43:24 np0005531888 nova_compute[186788]: 2025-11-22 08:43:24.525 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:26 np0005531888 nova_compute[186788]: 2025-11-22 08:43:26.113 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:26 np0005531888 podman[249240]: 2025-11-22 08:43:26.677696443 +0000 UTC m=+0.048781063 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Nov 22 03:43:26 np0005531888 podman[249241]: 2025-11-22 08:43:26.707329163 +0000 UTC m=+0.074541607 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:43:26 np0005531888 podman[249242]: 2025-11-22 08:43:26.740677015 +0000 UTC m=+0.104459304 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:43:26 np0005531888 nova_compute[186788]: 2025-11-22 08:43:26.994 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:29 np0005531888 nova_compute[186788]: 2025-11-22 08:43:29.527 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:30 np0005531888 nova_compute[186788]: 2025-11-22 08:43:30.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:31 np0005531888 nova_compute[186788]: 2025-11-22 08:43:31.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:33 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:33Z|00716|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 03:43:34 np0005531888 nova_compute[186788]: 2025-11-22 08:43:34.528 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:35.005 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:43:35 np0005531888 nova_compute[186788]: 2025-11-22 08:43:35.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:35.006 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:43:35 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:35.007 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:36 np0005531888 nova_compute[186788]: 2025-11-22 08:43:36.117 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:36.865 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:36.865 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:36.866 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:36 np0005531888 nova_compute[186788]: 2025-11-22 08:43:36.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:36 np0005531888 nova_compute[186788]: 2025-11-22 08:43:36.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.127 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.128 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.128 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.128 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.286 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.286 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5710MB free_disk=73.26691818237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.287 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.287 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.360 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.360 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.380 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.395 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.414 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 03:43:37 np0005531888 nova_compute[186788]: 2025-11-22 08:43:37.414 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:39 np0005531888 nova_compute[186788]: 2025-11-22 08:43:39.413 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:43:39 np0005531888 nova_compute[186788]: 2025-11-22 08:43:39.531 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:43:39 np0005531888 nova_compute[186788]: 2025-11-22 08:43:39.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:43:40 np0005531888 nova_compute[186788]: 2025-11-22 08:43:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:43:40 np0005531888 nova_compute[186788]: 2025-11-22 08:43:40.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.118 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.388 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.388 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.430 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.663 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.663 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:41 np0005531888 podman[249304]: 2025-11-22 08:43:41.66885843 +0000 UTC m=+0.045756798 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.672 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:43:41 np0005531888 podman[249305]: 2025-11-22 08:43:41.672637244 +0000 UTC m=+0.047460281 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.672 186792 INFO nova.compute.claims [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Claim successful on node compute-2.ctlplane.example.com
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.784 186792 DEBUG nova.compute.provider_tree [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.796 186792 DEBUG nova.scheduler.client.report [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.815 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.816 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.872 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.873 186792 DEBUG nova.network.neutron [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.888 186792 INFO nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:43:41 np0005531888 nova_compute[186788]: 2025-11-22 08:43:41.903 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.026 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.028 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.028 186792 INFO nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Creating image(s)
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.029 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.029 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.030 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.042 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.102 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.103 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.104 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.114 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.172 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.173 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.205 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.207 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.207 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.267 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.268 186792 DEBUG nova.virt.disk.api [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.269 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.327 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.328 186792 DEBUG nova.virt.disk.api [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.328 186792 DEBUG nova.objects.instance [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 429d2844-a37f-4e95-95d9-fac582824680 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.449 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.450 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Ensure instance console log exists: /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.451 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.451 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.451 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:42 np0005531888 nova_compute[186788]: 2025-11-22 08:43:42.775 186792 DEBUG nova.policy [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 03:43:44 np0005531888 nova_compute[186788]: 2025-11-22 08:43:44.532 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.837 186792 DEBUG nova.network.neutron [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Successfully updated port: 15aaa9ce-5a60-4a63-a8ba-48052e19c726 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.867 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-429d2844-a37f-4e95-95d9-fac582824680" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.867 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-429d2844-a37f-4e95-95d9-fac582824680" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.867 186792 DEBUG nova.network.neutron [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.992 186792 DEBUG nova.compute.manager [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received event network-changed-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.993 186792 DEBUG nova.compute.manager [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Refreshing instance network info cache due to event network-changed-15aaa9ce-5a60-4a63-a8ba-48052e19c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 03:43:45 np0005531888 nova_compute[186788]: 2025-11-22 08:43:45.993 186792 DEBUG oslo_concurrency.lockutils [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-429d2844-a37f-4e95-95d9-fac582824680" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:43:46 np0005531888 nova_compute[186788]: 2025-11-22 08:43:46.050 186792 DEBUG nova.network.neutron [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:43:46 np0005531888 nova_compute[186788]: 2025-11-22 08:43:46.120 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.122 186792 DEBUG nova.network.neutron [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Updating instance_info_cache with network_info: [{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.181 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-429d2844-a37f-4e95-95d9-fac582824680" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.181 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Instance network_info: |[{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.181 186792 DEBUG oslo_concurrency.lockutils [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-429d2844-a37f-4e95-95d9-fac582824680" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.182 186792 DEBUG nova.network.neutron [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Refreshing network info cache for port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.184 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Start _get_guest_xml network_info=[{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.188 186792 WARNING nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.193 186792 DEBUG nova.virt.libvirt.host [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.194 186792 DEBUG nova.virt.libvirt.host [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.197 186792 DEBUG nova.virt.libvirt.host [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.198 186792 DEBUG nova.virt.libvirt.host [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.199 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.200 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.200 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.201 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.201 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.201 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.202 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.202 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.202 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.202 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.203 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.203 186792 DEBUG nova.virt.hardware [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.207 186792 DEBUG nova.virt.libvirt.vif [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:43:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1663158606',display_name='tempest-TestNetworkBasicOps-server-1663158606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1663158606',id=174,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAll6M+sYFxjLWVVhK4modR4iE7KW0dq4PEv7sZt4BS3Kw2iJrjMhlNeTrtCIKnY9yvkdPTECcnm+gs2nncBRbnCxAUwNLIUChVCCndDUygYdrKviKGBUhX++7B0zBTXLw==',key_name='tempest-TestNetworkBasicOps-1377247375',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ey8vdv7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:43:41Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=429d2844-a37f-4e95-95d9-fac582824680,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.207 186792 DEBUG nova.network.os_vif_util [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.208 186792 DEBUG nova.network.os_vif_util [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.209 186792 DEBUG nova.objects.instance [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 429d2844-a37f-4e95-95d9-fac582824680 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.223 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <uuid>429d2844-a37f-4e95-95d9-fac582824680</uuid>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <name>instance-000000ae</name>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkBasicOps-server-1663158606</nova:name>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:43:47</nova:creationTime>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        <nova:port uuid="15aaa9ce-5a60-4a63-a8ba-48052e19c726">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <entry name="serial">429d2844-a37f-4e95-95d9-fac582824680</entry>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <entry name="uuid">429d2844-a37f-4e95-95d9-fac582824680</entry>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.config"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:26:d0:6c"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <target dev="tap15aaa9ce-5a"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/console.log" append="off"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:43:47 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:43:47 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:43:47 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:43:47 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.225 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Preparing to wait for external event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.225 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.225 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.225 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.226 186792 DEBUG nova.virt.libvirt.vif [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:43:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1663158606',display_name='tempest-TestNetworkBasicOps-server-1663158606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1663158606',id=174,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAll6M+sYFxjLWVVhK4modR4iE7KW0dq4PEv7sZt4BS3Kw2iJrjMhlNeTrtCIKnY9yvkdPTECcnm+gs2nncBRbnCxAUwNLIUChVCCndDUygYdrKviKGBUhX++7B0zBTXLw==',key_name='tempest-TestNetworkBasicOps-1377247375',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ey8vdv7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:43:41Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=429d2844-a37f-4e95-95d9-fac582824680,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.226 186792 DEBUG nova.network.os_vif_util [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.227 186792 DEBUG nova.network.os_vif_util [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.227 186792 DEBUG os_vif [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.228 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.228 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.229 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.231 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.231 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15aaa9ce-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.232 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15aaa9ce-5a, col_values=(('external_ids', {'iface-id': '15aaa9ce-5a60-4a63-a8ba-48052e19c726', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:d0:6c', 'vm-uuid': '429d2844-a37f-4e95-95d9-fac582824680'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.233 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:47 np0005531888 NetworkManager[55166]: <info>  [1763801027.2346] manager: (tap15aaa9ce-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.235 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.239 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.239 186792 INFO os_vif [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a')#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.287 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.288 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.288 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:26:d0:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.289 186792 INFO nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Using config drive#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.923 186792 INFO nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Creating config drive at /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.config#033[00m
Nov 22 03:43:47 np0005531888 nova_compute[186788]: 2025-11-22 08:43:47.928 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcujn9oo0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.053 186792 DEBUG oslo_concurrency.processutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcujn9oo0" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:43:48 np0005531888 kernel: tap15aaa9ce-5a: entered promiscuous mode
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.1196] manager: (tap15aaa9ce-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Nov 22 03:43:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:48Z|00717|binding|INFO|Claiming lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 for this chassis.
Nov 22 03:43:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:48Z|00718|binding|INFO|15aaa9ce-5a60-4a63-a8ba-48052e19c726: Claiming fa:16:3e:26:d0:6c 10.100.0.5
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.119 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.1337] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.1346] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.153 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:d0:6c 10.100.0.5'], port_security=['fa:16:3e:26:d0:6c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '429d2844-a37f-4e95-95d9-fac582824680', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf52ec35-2c17-43e3-9550-8e20cdf2c2b7, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=15aaa9ce-5a60-4a63-a8ba-48052e19c726) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.155 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 in datapath 6b97ad36-fe6a-4ecc-ae0a-fc772d456632 bound to our chassis#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.156 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b97ad36-fe6a-4ecc-ae0a-fc772d456632#033[00m
Nov 22 03:43:48 np0005531888 systemd-machined[153106]: New machine qemu-84-instance-000000ae.
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.167 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7e71dd-0ce2-4f70-ab11-fa03b9970941]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.168 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b97ad36-f1 in ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:43:48 np0005531888 systemd-udevd[249384]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.170 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b97ad36-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.170 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fffc1043-8f74-4e7b-9126-ee84b817ef77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.171 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6e758029-f87b-43bb-8174-015f977f3811]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.1854] device (tap15aaa9ce-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.1860] device (tap15aaa9ce-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.185 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8a0b6a-4230-4d78-86d3-ae5b66e16c91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.195 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.204 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.211 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7fa855-e101-4cab-ae1c-c4d73d3a7f1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 systemd[1]: Started Virtual Machine qemu-84-instance-000000ae.
Nov 22 03:43:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:48Z|00719|binding|INFO|Setting lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 ovn-installed in OVS
Nov 22 03:43:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:48Z|00720|binding|INFO|Setting lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 up in Southbound
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.215 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.237 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6b91d8-70f5-4183-a6d2-8ce11f39e2b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.2436] manager: (tap6b97ad36-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/344)
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.243 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ea8593-c7bd-4cb6-8f20-ca51ecb62059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.272 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[347f2ea4-cbf9-4c20-9ed8-4c7b942c4c54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.275 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[2874c473-979f-41d7-b5e1-3df523a69f4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.2960] device (tap6b97ad36-f0): carrier: link connected
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.302 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[68bd9a55-a933-4194-ba76-355bb24f3819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.319 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ee014eb3-db3b-40bf-9137-70d015bb7a01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b97ad36-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:84:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 776893, 'reachable_time': 26098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249416, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.333 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0c440bfe-922d-48f6-94df-481dee482dec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:84a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 776893, 'tstamp': 776893}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249417, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.350 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[797273cc-e65c-48d8-86ac-56fc57c248eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b97ad36-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:84:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 776893, 'reachable_time': 26098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249418, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.380 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f52499-abc6-4e19-93e7-0920752c2a3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.434 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9167fb6c-d3d2-4b42-8c79-d5eef9670d81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.436 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b97ad36-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.436 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.437 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b97ad36-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:48 np0005531888 NetworkManager[55166]: <info>  [1763801028.4403] manager: (tap6b97ad36-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.439 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 kernel: tap6b97ad36-f0: entered promiscuous mode
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.441 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.443 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b97ad36-f0, col_values=(('external_ids', {'iface-id': '04e092e8-b0e3-44aa-842f-7ec0ec9be431'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.444 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:48Z|00721|binding|INFO|Releasing lport 04e092e8-b0e3-44aa-842f-7ec0ec9be431 from this chassis (sb_readonly=0)
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.445 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.446 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9f527edd-26d1-43a6-bcd9-2d7978534e15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.446 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-6b97ad36-fe6a-4ecc-ae0a-fc772d456632
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.pid.haproxy
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 6b97ad36-fe6a-4ecc-ae0a-fc772d456632
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:43:48 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:48.447 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'env', 'PROCESS_TAG=haproxy-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.455 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.477 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801028.4775286, 429d2844-a37f-4e95-95d9-fac582824680 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.478 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] VM Started (Lifecycle Event)#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.498 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.502 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801028.478316, 429d2844-a37f-4e95-95d9-fac582824680 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.502 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.518 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.520 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.544 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.777 186792 DEBUG nova.compute.manager [req-42725d7c-130f-4aab-9b10-ccf31a566bb0 req-7fe5e47a-34b4-48c0-af8f-f9924e3dabe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.778 186792 DEBUG oslo_concurrency.lockutils [req-42725d7c-130f-4aab-9b10-ccf31a566bb0 req-7fe5e47a-34b4-48c0-af8f-f9924e3dabe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.779 186792 DEBUG oslo_concurrency.lockutils [req-42725d7c-130f-4aab-9b10-ccf31a566bb0 req-7fe5e47a-34b4-48c0-af8f-f9924e3dabe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.779 186792 DEBUG oslo_concurrency.lockutils [req-42725d7c-130f-4aab-9b10-ccf31a566bb0 req-7fe5e47a-34b4-48c0-af8f-f9924e3dabe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.779 186792 DEBUG nova.compute.manager [req-42725d7c-130f-4aab-9b10-ccf31a566bb0 req-7fe5e47a-34b4-48c0-af8f-f9924e3dabe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Processing event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.780 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.784 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801028.7842815, 429d2844-a37f-4e95-95d9-fac582824680 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.785 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.787 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.793 186792 INFO nova.virt.libvirt.driver [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Instance spawned successfully.#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.794 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.800 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:48 np0005531888 podman[249459]: 2025-11-22 08:43:48.806639798 +0000 UTC m=+0.053514547 container create 6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:43:48 np0005531888 systemd[1]: Started libpod-conmon-6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999.scope.
Nov 22 03:43:48 np0005531888 podman[249459]: 2025-11-22 08:43:48.775483562 +0000 UTC m=+0.022358331 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:43:48 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:43:48 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d06eca8248ff4044f5cd6ebdd4b5ecc32a6fe97a3e93b82ed204036d8057a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:43:48 np0005531888 podman[249459]: 2025-11-22 08:43:48.923405731 +0000 UTC m=+0.170280510 container init 6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 03:43:48 np0005531888 podman[249459]: 2025-11-22 08:43:48.92908212 +0000 UTC m=+0.175956869 container start 6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.934 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.938 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.939 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.939 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.940 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.940 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.940 186792 DEBUG nova.virt.libvirt.driver [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:48 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [NOTICE]   (249479) : New worker (249481) forked
Nov 22 03:43:48 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [NOTICE]   (249479) : Loading success.
Nov 22 03:43:48 np0005531888 nova_compute[186788]: 2025-11-22 08:43:48.968 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.011 186792 INFO nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Took 6.98 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.012 186792 DEBUG nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.106 186792 INFO nova.compute.manager [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Took 7.58 seconds to build instance.#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.133 186792 DEBUG oslo_concurrency.lockutils [None req-2860a7d1-766b-489e-97e7-181b9f81af4e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.160 186792 DEBUG nova.network.neutron [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Updated VIF entry in instance network info cache for port 15aaa9ce-5a60-4a63-a8ba-48052e19c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.160 186792 DEBUG nova.network.neutron [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Updating instance_info_cache with network_info: [{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.183 186792 DEBUG oslo_concurrency.lockutils [req-5bc412c7-bd34-4967-a8cc-feb40b843b23 req-6c18a9e6-b4a9-4c30-ae24-25ceecd105e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-429d2844-a37f-4e95-95d9-fac582824680" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:43:49 np0005531888 nova_compute[186788]: 2025-11-22 08:43:49.536 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:50 np0005531888 nova_compute[186788]: 2025-11-22 08:43:50.853 186792 DEBUG nova.compute.manager [req-9e286765-e451-4dcf-b6ae-119214195a9a req-ac72e1d6-65de-4f44-a692-fabac1138335 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:50 np0005531888 nova_compute[186788]: 2025-11-22 08:43:50.854 186792 DEBUG oslo_concurrency.lockutils [req-9e286765-e451-4dcf-b6ae-119214195a9a req-ac72e1d6-65de-4f44-a692-fabac1138335 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:50 np0005531888 nova_compute[186788]: 2025-11-22 08:43:50.854 186792 DEBUG oslo_concurrency.lockutils [req-9e286765-e451-4dcf-b6ae-119214195a9a req-ac72e1d6-65de-4f44-a692-fabac1138335 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:50 np0005531888 nova_compute[186788]: 2025-11-22 08:43:50.854 186792 DEBUG oslo_concurrency.lockutils [req-9e286765-e451-4dcf-b6ae-119214195a9a req-ac72e1d6-65de-4f44-a692-fabac1138335 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:50 np0005531888 nova_compute[186788]: 2025-11-22 08:43:50.855 186792 DEBUG nova.compute.manager [req-9e286765-e451-4dcf-b6ae-119214195a9a req-ac72e1d6-65de-4f44-a692-fabac1138335 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] No waiting events found dispatching network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:43:50 np0005531888 nova_compute[186788]: 2025-11-22 08:43:50.855 186792 WARNING nova.compute.manager [req-9e286765-e451-4dcf-b6ae-119214195a9a req-ac72e1d6-65de-4f44-a692-fabac1138335 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received unexpected event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:43:51 np0005531888 nova_compute[186788]: 2025-11-22 08:43:51.987 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:51 np0005531888 nova_compute[186788]: 2025-11-22 08:43:51.988 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:51 np0005531888 nova_compute[186788]: 2025-11-22 08:43:51.988 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:51 np0005531888 nova_compute[186788]: 2025-11-22 08:43:51.988 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:51 np0005531888 nova_compute[186788]: 2025-11-22 08:43:51.989 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:51 np0005531888 nova_compute[186788]: 2025-11-22 08:43:51.997 186792 INFO nova.compute.manager [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Terminating instance#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.004 186792 DEBUG nova.compute.manager [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:43:52 np0005531888 kernel: tap15aaa9ce-5a (unregistering): left promiscuous mode
Nov 22 03:43:52 np0005531888 NetworkManager[55166]: <info>  [1763801032.0223] device (tap15aaa9ce-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.030 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:52Z|00722|binding|INFO|Releasing lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 from this chassis (sb_readonly=0)
Nov 22 03:43:52 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:52Z|00723|binding|INFO|Setting lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 down in Southbound
Nov 22 03:43:52 np0005531888 ovn_controller[95067]: 2025-11-22T08:43:52Z|00724|binding|INFO|Removing iface tap15aaa9ce-5a ovn-installed in OVS
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.032 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.037 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:d0:6c 10.100.0.5'], port_security=['fa:16:3e:26:d0:6c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '429d2844-a37f-4e95-95d9-fac582824680', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '9', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.189', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf52ec35-2c17-43e3-9550-8e20cdf2c2b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=15aaa9ce-5a60-4a63-a8ba-48052e19c726) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.039 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 in datapath 6b97ad36-fe6a-4ecc-ae0a-fc772d456632 unbound from our chassis#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.040 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.041 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[06f0a6e5-e876-4faa-ad1c-f1e9c09bc6e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.041 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 namespace which is not needed anymore#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.048 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Nov 22 03:43:52 np0005531888 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ae.scope: Consumed 3.489s CPU time.
Nov 22 03:43:52 np0005531888 systemd-machined[153106]: Machine qemu-84-instance-000000ae terminated.
Nov 22 03:43:52 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [NOTICE]   (249479) : haproxy version is 2.8.14-c23fe91
Nov 22 03:43:52 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [NOTICE]   (249479) : path to executable is /usr/sbin/haproxy
Nov 22 03:43:52 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [WARNING]  (249479) : Exiting Master process...
Nov 22 03:43:52 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [WARNING]  (249479) : Exiting Master process...
Nov 22 03:43:52 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [ALERT]    (249479) : Current worker (249481) exited with code 143 (Terminated)
Nov 22 03:43:52 np0005531888 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[249475]: [WARNING]  (249479) : All workers exited. Exiting... (0)
Nov 22 03:43:52 np0005531888 systemd[1]: libpod-6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999.scope: Deactivated successfully.
Nov 22 03:43:52 np0005531888 podman[249514]: 2025-11-22 08:43:52.208865449 +0000 UTC m=+0.085078244 container died 6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.226 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.231 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.233 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999-userdata-shm.mount: Deactivated successfully.
Nov 22 03:43:52 np0005531888 systemd[1]: var-lib-containers-storage-overlay-57d06eca8248ff4044f5cd6ebdd4b5ecc32a6fe97a3e93b82ed204036d8057a5-merged.mount: Deactivated successfully.
Nov 22 03:43:52 np0005531888 podman[249514]: 2025-11-22 08:43:52.270588098 +0000 UTC m=+0.146800893 container cleanup 6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.270 186792 INFO nova.virt.libvirt.driver [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Instance destroyed successfully.#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.270 186792 DEBUG nova.objects.instance [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 429d2844-a37f-4e95-95d9-fac582824680 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:43:52 np0005531888 systemd[1]: libpod-conmon-6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999.scope: Deactivated successfully.
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.286 186792 DEBUG nova.virt.libvirt.vif [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:43:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1663158606',display_name='tempest-TestNetworkBasicOps-server-1663158606',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1663158606',id=174,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAll6M+sYFxjLWVVhK4modR4iE7KW0dq4PEv7sZt4BS3Kw2iJrjMhlNeTrtCIKnY9yvkdPTECcnm+gs2nncBRbnCxAUwNLIUChVCCndDUygYdrKviKGBUhX++7B0zBTXLw==',key_name='tempest-TestNetworkBasicOps-1377247375',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:43:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ey8vdv7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:43:49Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=429d2844-a37f-4e95-95d9-fac582824680,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.286 186792 DEBUG nova.network.os_vif_util [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.287 186792 DEBUG nova.network.os_vif_util [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.287 186792 DEBUG os_vif [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.289 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15aaa9ce-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.291 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.292 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.294 186792 INFO os_vif [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a')#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.294 186792 INFO nova.virt.libvirt.driver [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Deleting instance files /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680_del#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.295 186792 INFO nova.virt.libvirt.driver [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Deletion of /var/lib/nova/instances/429d2844-a37f-4e95-95d9-fac582824680_del complete#033[00m
Nov 22 03:43:52 np0005531888 podman[249558]: 2025-11-22 08:43:52.351658282 +0000 UTC m=+0.057892515 container remove 6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.359 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ead62db8-22cd-4269-8b0c-65022bbdb3cc]: (4, ('Sat Nov 22 08:43:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 (6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999)\n6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999\nSat Nov 22 08:43:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 (6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999)\n6b8afe9f7f19f01d238ed48477c32c995a7ea38c9e2c47f734d441412fce8999\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.361 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b863a0f1-d934-42c8-a653-a2f65cc96494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.361 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b97ad36-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.363 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 kernel: tap6b97ad36-f0: left promiscuous mode
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.377 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.378 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[858b3a6e-038c-4900-b1aa-c7a01140ff05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.392 186792 INFO nova.compute.manager [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.393 186792 DEBUG oslo.service.loopingcall [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.393 186792 DEBUG nova.compute.manager [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.393 186792 DEBUG nova.network.neutron [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.398 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[df0cd1ce-6cd2-4e04-b5c8-3fb25aa609fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.399 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5a8baea5-4940-43a9-89ed-7e449c198aa3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.417 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b1528a50-a652-48dc-949c-40a8ac13a846]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 776887, 'reachable_time': 44869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249572, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 systemd[1]: run-netns-ovnmeta\x2d6b97ad36\x2dfe6a\x2d4ecc\x2dae0a\x2dfc772d456632.mount: Deactivated successfully.
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.422 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:43:52 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:43:52.422 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3a389c-e37e-460a-9f20-98393698e07a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.927 186792 DEBUG nova.compute.manager [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received event network-vif-unplugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.928 186792 DEBUG oslo_concurrency.lockutils [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.928 186792 DEBUG oslo_concurrency.lockutils [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.929 186792 DEBUG oslo_concurrency.lockutils [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.929 186792 DEBUG nova.compute.manager [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] No waiting events found dispatching network-vif-unplugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.929 186792 DEBUG nova.compute.manager [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received event network-vif-unplugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.929 186792 DEBUG nova.compute.manager [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.930 186792 DEBUG oslo_concurrency.lockutils [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "429d2844-a37f-4e95-95d9-fac582824680-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.930 186792 DEBUG oslo_concurrency.lockutils [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.930 186792 DEBUG oslo_concurrency.lockutils [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.930 186792 DEBUG nova.compute.manager [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] No waiting events found dispatching network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:43:52 np0005531888 nova_compute[186788]: 2025-11-22 08:43:52.931 186792 WARNING nova.compute.manager [req-9b3f5624-7747-4083-b3e9-9ceea970cbcd req-ab51e7bc-d5d2-4bcf-9b25-28dd83164b4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Received unexpected event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.639 186792 DEBUG nova.network.neutron [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.658 186792 INFO nova.compute.manager [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Took 1.26 seconds to deallocate network for instance.#033[00m
Nov 22 03:43:53 np0005531888 podman[249573]: 2025-11-22 08:43:53.691274369 +0000 UTC m=+0.063882833 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:43:53 np0005531888 podman[249574]: 2025-11-22 08:43:53.692361525 +0000 UTC m=+0.061429981 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.731 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.731 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.801 186792 DEBUG nova.compute.provider_tree [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.815 186792 DEBUG nova.scheduler.client.report [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.832 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.862 186792 INFO nova.scheduler.client.report [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 429d2844-a37f-4e95-95d9-fac582824680#033[00m
Nov 22 03:43:53 np0005531888 nova_compute[186788]: 2025-11-22 08:43:53.935 186792 DEBUG oslo_concurrency.lockutils [None req-f6dfcfdc-1bf1-42d1-9d9f-3c46d5682233 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "429d2844-a37f-4e95-95d9-fac582824680" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:54 np0005531888 nova_compute[186788]: 2025-11-22 08:43:54.538 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:57 np0005531888 nova_compute[186788]: 2025-11-22 08:43:57.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:57 np0005531888 podman[249615]: 2025-11-22 08:43:57.682057578 +0000 UTC m=+0.055833315 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:43:57 np0005531888 podman[249616]: 2025-11-22 08:43:57.703070565 +0000 UTC m=+0.070559287 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:43:57 np0005531888 podman[249617]: 2025-11-22 08:43:57.72401907 +0000 UTC m=+0.090110118 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:43:59 np0005531888 nova_compute[186788]: 2025-11-22 08:43:59.542 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:00 np0005531888 nova_compute[186788]: 2025-11-22 08:44:00.083 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:00 np0005531888 nova_compute[186788]: 2025-11-22 08:44:00.153 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:02 np0005531888 nova_compute[186788]: 2025-11-22 08:44:02.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:04 np0005531888 nova_compute[186788]: 2025-11-22 08:44:04.543 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:07 np0005531888 nova_compute[186788]: 2025-11-22 08:44:07.269 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801032.2670155, 429d2844-a37f-4e95-95d9-fac582824680 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:44:07 np0005531888 nova_compute[186788]: 2025-11-22 08:44:07.270 186792 INFO nova.compute.manager [-] [instance: 429d2844-a37f-4e95-95d9-fac582824680] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:44:07 np0005531888 nova_compute[186788]: 2025-11-22 08:44:07.288 186792 DEBUG nova.compute.manager [None req-36c09541-2a86-4760-a822-2ab32e6f7743 - - - - - -] [instance: 429d2844-a37f-4e95-95d9-fac582824680] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:44:07 np0005531888 nova_compute[186788]: 2025-11-22 08:44:07.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:09 np0005531888 nova_compute[186788]: 2025-11-22 08:44:09.546 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:12 np0005531888 nova_compute[186788]: 2025-11-22 08:44:12.302 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:12 np0005531888 podman[249681]: 2025-11-22 08:44:12.830454563 +0000 UTC m=+0.196545166 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 03:44:12 np0005531888 podman[249680]: 2025-11-22 08:44:12.847356959 +0000 UTC m=+0.219312066 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:44:14 np0005531888 nova_compute[186788]: 2025-11-22 08:44:14.547 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:17 np0005531888 nova_compute[186788]: 2025-11-22 08:44:17.305 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:19 np0005531888 nova_compute[186788]: 2025-11-22 08:44:19.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:22 np0005531888 nova_compute[186788]: 2025-11-22 08:44:22.308 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:24 np0005531888 nova_compute[186788]: 2025-11-22 08:44:24.551 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:24 np0005531888 podman[249725]: 2025-11-22 08:44:24.679697753 +0000 UTC m=+0.049623972 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:44:24 np0005531888 podman[249724]: 2025-11-22 08:44:24.683533547 +0000 UTC m=+0.056794439 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 22 03:44:24 np0005531888 nova_compute[186788]: 2025-11-22 08:44:24.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:25 np0005531888 nova_compute[186788]: 2025-11-22 08:44:25.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:25 np0005531888 nova_compute[186788]: 2025-11-22 08:44:25.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:44:25 np0005531888 nova_compute[186788]: 2025-11-22 08:44:25.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:44:25 np0005531888 nova_compute[186788]: 2025-11-22 08:44:25.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:44:26 np0005531888 nova_compute[186788]: 2025-11-22 08:44:26.713 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:26 np0005531888 nova_compute[186788]: 2025-11-22 08:44:26.714 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:26 np0005531888 nova_compute[186788]: 2025-11-22 08:44:26.946 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:44:26 np0005531888 nova_compute[186788]: 2025-11-22 08:44:26.960 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:27 np0005531888 nova_compute[186788]: 2025-11-22 08:44:27.229 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:27 np0005531888 nova_compute[186788]: 2025-11-22 08:44:27.230 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:27 np0005531888 nova_compute[186788]: 2025-11-22 08:44:27.238 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:44:27 np0005531888 nova_compute[186788]: 2025-11-22 08:44:27.238 186792 INFO nova.compute.claims [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:44:27 np0005531888 nova_compute[186788]: 2025-11-22 08:44:27.312 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.431 186792 DEBUG nova.compute.provider_tree [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.445 186792 DEBUG nova.scheduler.client.report [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.514 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.515 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:44:28 np0005531888 podman[249769]: 2025-11-22 08:44:28.680448037 +0000 UTC m=+0.055186878 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 22 03:44:28 np0005531888 podman[249768]: 2025-11-22 08:44:28.680781336 +0000 UTC m=+0.057181958 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public)
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.694 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.694 186792 DEBUG nova.network.neutron [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:44:28 np0005531888 podman[249770]: 2025-11-22 08:44:28.705406451 +0000 UTC m=+0.077187410 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.805 186792 INFO nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.854 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:44:28 np0005531888 nova_compute[186788]: 2025-11-22 08:44:28.911 186792 DEBUG nova.policy [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.035 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.036 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.037 186792 INFO nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Creating image(s)#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.037 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.037 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.038 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.049 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.107 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.108 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.109 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.121 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.176 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.177 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.381 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk 1073741824" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.382 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.383 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.443 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.444 186792 DEBUG nova.virt.disk.api [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.445 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.496 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.497 186792 DEBUG nova.virt.disk.api [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.497 186792 DEBUG nova.objects.instance [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 1642882a-1940-4212-a4b4-85fb63259b3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.511 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.511 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Ensure instance console log exists: /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.512 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.512 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.513 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:29 np0005531888 nova_compute[186788]: 2025-11-22 08:44:29.551 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:30 np0005531888 nova_compute[186788]: 2025-11-22 08:44:30.749 186792 DEBUG nova.network.neutron [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Successfully created port: 3033e008-cd6e-4748-9b74-590905825b5d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.890 186792 DEBUG nova.network.neutron [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Successfully updated port: 3033e008-cd6e-4748-9b74-590905825b5d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.971 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.972 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.972 186792 DEBUG nova.network.neutron [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.995 186792 DEBUG nova.compute.manager [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-changed-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.995 186792 DEBUG nova.compute.manager [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Refreshing instance network info cache due to event network-changed-3033e008-cd6e-4748-9b74-590905825b5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:44:31 np0005531888 nova_compute[186788]: 2025-11-22 08:44:31.995 186792 DEBUG oslo_concurrency.lockutils [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:44:32 np0005531888 nova_compute[186788]: 2025-11-22 08:44:32.112 186792 DEBUG nova.network.neutron [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:44:32 np0005531888 nova_compute[186788]: 2025-11-22 08:44:32.315 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:32 np0005531888 nova_compute[186788]: 2025-11-22 08:44:32.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.106 186792 DEBUG nova.network.neutron [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updating instance_info_cache with network_info: [{"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.158 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.158 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Instance network_info: |[{"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.159 186792 DEBUG oslo_concurrency.lockutils [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.159 186792 DEBUG nova.network.neutron [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Refreshing network info cache for port 3033e008-cd6e-4748-9b74-590905825b5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.162 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Start _get_guest_xml network_info=[{"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.165 186792 WARNING nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.171 186792 DEBUG nova.virt.libvirt.host [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.172 186792 DEBUG nova.virt.libvirt.host [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.179 186792 DEBUG nova.virt.libvirt.host [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.180 186792 DEBUG nova.virt.libvirt.host [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.181 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.181 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.181 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.182 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.182 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.182 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.182 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.183 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.183 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.183 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.183 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.184 186792 DEBUG nova.virt.hardware [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.187 186792 DEBUG nova.virt.libvirt.vif [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1738764145',display_name='tempest-TestNetworkBasicOps-server-1738764145',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1738764145',id=175,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ9I30YDOJc7zeNSbR4OTBRrIZBRWgzX0sO0vVuFLlLZHjv8EuLeLmF0elL6jI/157DdKAZTLm8gwZq48MSq86/kGh73bqrqC9SVezNrCrOmrFURIRseA/Bfw27EY0ZvRw==',key_name='tempest-TestNetworkBasicOps-812610295',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-vi0k7qqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:44:28Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=1642882a-1940-4212-a4b4-85fb63259b3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.187 186792 DEBUG nova.network.os_vif_util [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.188 186792 DEBUG nova.network.os_vif_util [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.189 186792 DEBUG nova.objects.instance [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1642882a-1940-4212-a4b4-85fb63259b3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.202 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <uuid>1642882a-1940-4212-a4b4-85fb63259b3d</uuid>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <name>instance-000000af</name>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:name>tempest-TestNetworkBasicOps-server-1738764145</nova:name>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:44:33</nova:creationTime>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        <nova:port uuid="3033e008-cd6e-4748-9b74-590905825b5d">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <entry name="serial">1642882a-1940-4212-a4b4-85fb63259b3d</entry>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <entry name="uuid">1642882a-1940-4212-a4b4-85fb63259b3d</entry>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.config"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:79:d1:70"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <target dev="tap3033e008-cd"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/console.log" append="off"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:44:33 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:44:33 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:44:33 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:44:33 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.203 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Preparing to wait for external event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.203 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.204 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.204 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.205 186792 DEBUG nova.virt.libvirt.vif [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1738764145',display_name='tempest-TestNetworkBasicOps-server-1738764145',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1738764145',id=175,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ9I30YDOJc7zeNSbR4OTBRrIZBRWgzX0sO0vVuFLlLZHjv8EuLeLmF0elL6jI/157DdKAZTLm8gwZq48MSq86/kGh73bqrqC9SVezNrCrOmrFURIRseA/Bfw27EY0ZvRw==',key_name='tempest-TestNetworkBasicOps-812610295',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-vi0k7qqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:44:28Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=1642882a-1940-4212-a4b4-85fb63259b3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.205 186792 DEBUG nova.network.os_vif_util [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.205 186792 DEBUG nova.network.os_vif_util [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.206 186792 DEBUG os_vif [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.206 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.206 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.207 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.209 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.209 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3033e008-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.209 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3033e008-cd, col_values=(('external_ids', {'iface-id': '3033e008-cd6e-4748-9b74-590905825b5d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:d1:70', 'vm-uuid': '1642882a-1940-4212-a4b4-85fb63259b3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.211 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:33 np0005531888 NetworkManager[55166]: <info>  [1763801073.2126] manager: (tap3033e008-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.213 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.216 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.217 186792 INFO os_vif [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd')#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.405 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.405 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.405 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:79:d1:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.406 186792 INFO nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Using config drive#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.781 186792 INFO nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Creating config drive at /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.config#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.785 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphk_qxe34 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.908 186792 DEBUG oslo_concurrency.processutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphk_qxe34" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:33 np0005531888 kernel: tap3033e008-cd: entered promiscuous mode
Nov 22 03:44:33 np0005531888 NetworkManager[55166]: <info>  [1763801073.9823] manager: (tap3033e008-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Nov 22 03:44:33 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:33Z|00725|binding|INFO|Claiming lport 3033e008-cd6e-4748-9b74-590905825b5d for this chassis.
Nov 22 03:44:33 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:33Z|00726|binding|INFO|3033e008-cd6e-4748-9b74-590905825b5d: Claiming fa:16:3e:79:d1:70 10.100.0.8
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:33 np0005531888 nova_compute[186788]: 2025-11-22 08:44:33.985 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.000 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:d1:70 10.100.0.8'], port_security=['fa:16:3e:79:d1:70 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea197933-b137-4edc-a409-ccba428ee2cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '54e44449-0bb8-4e26-9a58-cb316fa17a67', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fd495cc-3d66-4689-af7a-1e87868108e0, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=3033e008-cd6e-4748-9b74-590905825b5d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.001 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 3033e008-cd6e-4748-9b74-590905825b5d in datapath ea197933-b137-4edc-a409-ccba428ee2cd bound to our chassis#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.002 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea197933-b137-4edc-a409-ccba428ee2cd#033[00m
Nov 22 03:44:34 np0005531888 systemd-udevd[249864]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.012 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[8024be4a-09b6-40eb-9889-6431517d0b37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.013 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea197933-b1 in ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.017 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea197933-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.017 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2a4fe0-e099-4d01-aa70-1e1c229bde05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.018 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ea9a4f-2cf9-4d3e-8f51-ab81c96c8ab2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 systemd-machined[153106]: New machine qemu-85-instance-000000af.
Nov 22 03:44:34 np0005531888 NetworkManager[55166]: <info>  [1763801074.0262] device (tap3033e008-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:44:34 np0005531888 NetworkManager[55166]: <info>  [1763801074.0273] device (tap3033e008-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.031 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[85f0a2da-3a78-41ed-a107-6082479cb87e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.044 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[40fa0aa6-9f3d-4705-9774-e8d05d27e4e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:34Z|00727|binding|INFO|Setting lport 3033e008-cd6e-4748-9b74-590905825b5d ovn-installed in OVS
Nov 22 03:44:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:34Z|00728|binding|INFO|Setting lport 3033e008-cd6e-4748-9b74-590905825b5d up in Southbound
Nov 22 03:44:34 np0005531888 systemd[1]: Started Virtual Machine qemu-85-instance-000000af.
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.049 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.070 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[6d47cb8e-5212-4779-b6ae-351e5814f9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.077 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[682aab32-498b-4b9a-9f41-6cd394a32a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 NetworkManager[55166]: <info>  [1763801074.0795] manager: (tapea197933-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/348)
Nov 22 03:44:34 np0005531888 systemd-udevd[249868]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.112 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[123f0966-1cdd-44fc-9a0a-b71364ce98e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.116 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3517c0-a3b3-43d0-ab1e-cf9ecf308560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 NetworkManager[55166]: <info>  [1763801074.1444] device (tapea197933-b0): carrier: link connected
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.151 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[a6be51e0-606e-4284-ba04-b54755ad4be4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.170 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9f2c81-194f-48ee-aa2f-9979a62e6f9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea197933-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:66:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 781478, 'reachable_time': 33211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249900, 'error': None, 'target': 'ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.187 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[64ad0267-a1e2-4529-8346-9349c22cf5b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:666c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 781478, 'tstamp': 781478}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249901, 'error': None, 'target': 'ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.209 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d1dd3e0a-49df-45ea-869d-200f16b215c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea197933-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:66:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 781478, 'reachable_time': 33211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249902, 'error': None, 'target': 'ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.240 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5c0e06-d7bf-4b8a-bdbb-f10cb34b6354]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.301 186792 DEBUG nova.compute.manager [req-22252c2c-c39e-40d0-9c0d-0e3378f32f81 req-97646977-7fee-41a5-8275-c1708902521f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.305 186792 DEBUG oslo_concurrency.lockutils [req-22252c2c-c39e-40d0-9c0d-0e3378f32f81 req-97646977-7fee-41a5-8275-c1708902521f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.305 186792 DEBUG oslo_concurrency.lockutils [req-22252c2c-c39e-40d0-9c0d-0e3378f32f81 req-97646977-7fee-41a5-8275-c1708902521f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.306 186792 DEBUG oslo_concurrency.lockutils [req-22252c2c-c39e-40d0-9c0d-0e3378f32f81 req-97646977-7fee-41a5-8275-c1708902521f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.306 186792 DEBUG nova.compute.manager [req-22252c2c-c39e-40d0-9c0d-0e3378f32f81 req-97646977-7fee-41a5-8275-c1708902521f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Processing event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.310 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[101f889a-d8db-493b-834f-052b3a1bfd1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.312 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea197933-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.312 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.313 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea197933-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.315 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 kernel: tapea197933-b0: entered promiscuous mode
Nov 22 03:44:34 np0005531888 NetworkManager[55166]: <info>  [1763801074.3164] manager: (tapea197933-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.318 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.321 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea197933-b0, col_values=(('external_ids', {'iface-id': '0aea6cf7-2802-4ea0-83fb-156a81dc390f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.322 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:34Z|00729|binding|INFO|Releasing lport 0aea6cf7-2802-4ea0-83fb-156a81dc390f from this chassis (sb_readonly=0)
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.326 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea197933-b137-4edc-a409-ccba428ee2cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea197933-b137-4edc-a409-ccba428ee2cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.327 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd501d6-d52d-4623-b3ec-e2eb8176c5c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.328 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-ea197933-b137-4edc-a409-ccba428ee2cd
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/ea197933-b137-4edc-a409-ccba428ee2cd.pid.haproxy
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID ea197933-b137-4edc-a409-ccba428ee2cd
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:44:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:34.330 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd', 'env', 'PROCESS_TAG=haproxy-ea197933-b137-4edc-a409-ccba428ee2cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea197933-b137-4edc-a409-ccba428ee2cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.335 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.516 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801074.5154781, 1642882a-1940-4212-a4b4-85fb63259b3d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.517 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] VM Started (Lifecycle Event)#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.520 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.524 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.528 186792 INFO nova.virt.libvirt.driver [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Instance spawned successfully.#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.529 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.533 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.540 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.556 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.559 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.559 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801074.5183444, 1642882a-1940-4212-a4b4-85fb63259b3d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.559 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.562 186792 DEBUG nova.network.neutron [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updated VIF entry in instance network info cache for port 3033e008-cd6e-4748-9b74-590905825b5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.562 186792 DEBUG nova.network.neutron [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updating instance_info_cache with network_info: [{"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.568 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.569 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.569 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.570 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.570 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.570 186792 DEBUG nova.virt.libvirt.driver [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.602 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.604 186792 DEBUG oslo_concurrency.lockutils [req-dcf8bb2a-0251-45ab-b116-aad8ba3ae153 req-11e66878-1883-4243-8c71-a6b38a9f2efa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.607 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801074.5251017, 1642882a-1940-4212-a4b4-85fb63259b3d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.608 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.626 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.630 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.660 186792 INFO nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Took 5.62 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.660 186792 DEBUG nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.672 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.722 186792 INFO nova.compute.manager [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Took 7.53 seconds to build instance.#033[00m
Nov 22 03:44:34 np0005531888 nova_compute[186788]: 2025-11-22 08:44:34.737 186792 DEBUG oslo_concurrency.lockutils [None req-51663806-49e9-4d32-a8b5-42012fe19e02 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:34 np0005531888 podman[249940]: 2025-11-22 08:44:34.686973327 +0000 UTC m=+0.022476274 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:44:34 np0005531888 podman[249940]: 2025-11-22 08:44:34.805827491 +0000 UTC m=+0.141330408 container create 415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:44:34 np0005531888 systemd[1]: Started libpod-conmon-415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c.scope.
Nov 22 03:44:34 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:44:34 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e43a12e63238784311f83f70143b552db059e45fbd46f7ea4281305de601aadc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:44:34 np0005531888 podman[249940]: 2025-11-22 08:44:34.914374962 +0000 UTC m=+0.249877909 container init 415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 03:44:34 np0005531888 podman[249940]: 2025-11-22 08:44:34.92119632 +0000 UTC m=+0.256699237 container start 415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 03:44:34 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [NOTICE]   (249960) : New worker (249962) forked
Nov 22 03:44:34 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [NOTICE]   (249960) : Loading success.
Nov 22 03:44:36 np0005531888 nova_compute[186788]: 2025-11-22 08:44:36.401 186792 DEBUG nova.compute.manager [req-b1113afe-49ba-492e-8082-875da227ecdc req-bf3319d9-b919-4385-9378-eeb555597714 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:44:36 np0005531888 nova_compute[186788]: 2025-11-22 08:44:36.401 186792 DEBUG oslo_concurrency.lockutils [req-b1113afe-49ba-492e-8082-875da227ecdc req-bf3319d9-b919-4385-9378-eeb555597714 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:36 np0005531888 nova_compute[186788]: 2025-11-22 08:44:36.401 186792 DEBUG oslo_concurrency.lockutils [req-b1113afe-49ba-492e-8082-875da227ecdc req-bf3319d9-b919-4385-9378-eeb555597714 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:36 np0005531888 nova_compute[186788]: 2025-11-22 08:44:36.401 186792 DEBUG oslo_concurrency.lockutils [req-b1113afe-49ba-492e-8082-875da227ecdc req-bf3319d9-b919-4385-9378-eeb555597714 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:36 np0005531888 nova_compute[186788]: 2025-11-22 08:44:36.402 186792 DEBUG nova.compute.manager [req-b1113afe-49ba-492e-8082-875da227ecdc req-bf3319d9-b919-4385-9378-eeb555597714 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:44:36 np0005531888 nova_compute[186788]: 2025-11-22 08:44:36.402 186792 WARNING nova.compute.manager [req-b1113afe-49ba-492e-8082-875da227ecdc req-bf3319d9-b919-4385-9378-eeb555597714 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received unexpected event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with vm_state active and task_state None.#033[00m
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.858 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'name': 'tempest-TestNetworkBasicOps-server-1738764145', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000af', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '12f63a6d87a947758ab928c0d625ff06', 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'hostId': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.859 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.863 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1642882a-1940-4212-a4b4-85fb63259b3d / tap3033e008-cd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.863 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31d65a3d-99d0-4d41-9596-2105bbaa68a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.860093', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79ea1700-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '6fb6a952d4218a3288a606865395e98a020aea53e4b047cd2d9bb4258000e2e7'}]}, 'timestamp': '2025-11-22 08:44:36.864588', '_unique_id': '63fcec8fd30845c0853ba24463d04090'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.865 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:36.867 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:44:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:36.867 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:44:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:36.868 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.869 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.878 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.879 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd51cb655-6217-46d6-a252-dc689a90b14b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.869305', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79ec4ef8-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.56881976, 'message_signature': '279d4095f2fff1cdf72335e70b62040d62810f780acf8ef890033accd7376482'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.869305', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79ec5e8e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.56881976, 'message_signature': 'a2f6cd966358e4d6957d637d28b6ba675f3bc37eed8643139530efc961617b27'}]}, 'timestamp': '2025-11-22 08:44:36.879435', '_unique_id': '7ffdda87bb764ea39854ff7cf44a3f1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.883 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f303209-43b3-41d1-a061-047f01b3e022', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.883472', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79ed0ab4-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': 'fc7cb06dbc9fdb9f3cf8241b5ee52b3287be16775d323104a4234559dd49b439'}]}, 'timestamp': '2025-11-22 08:44:36.883850', '_unique_id': '922886c868fe49a288fbecaebe9a4805'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.884 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.887 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.887 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.888 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>]
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.888 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.888 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>]
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.889 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e83f257-aa3d-4bc8-9b43-76d676120001', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.889226', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79edea2e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '882ac5bef05ded3609796d53cccece1aa74114bb0b2451892f14d980e8de11cc'}]}, 'timestamp': '2025-11-22 08:44:36.889590', '_unique_id': '1f65bb41fec3478197ca3095eaf33a24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.890 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.893 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d20670d-d376-4fb4-9e08-f2842b280201', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.893070', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79ee80ba-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '199306d033034e39541f0eeb4ed21ce11ac9cb0db32318e6951d64f999c44564'}]}, 'timestamp': '2025-11-22 08:44:36.893427', '_unique_id': '910d8541a06f4616a37e584709426bf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.894 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.897 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.915 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/cpu volume: 2290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17343b62-e059-4777-9a09-dfc55e818620', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2290000000, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'timestamp': '2025-11-22T08:44:36.897615', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '79f1ed0e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.614532844, 'message_signature': '1349438f5ec22fac939a127e22843bf37bedb48be567e924812bc719537b5b6e'}]}, 'timestamp': '2025-11-22 08:44:36.915981', '_unique_id': 'feae725becc64c03823e5c753e73d513'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.916 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.920 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.921 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>]
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.947 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.948 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6de4586-d7de-4802-b2ae-9a2d26481cab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.921712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79f6e980-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '4656e1e29d457740a85c43c177b2254bbc3ca839c1eb9ca4f86b2324ff526345'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.921712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79f6fd1c-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': 'c2eecebd168d38a6dabd26ddd8a6b39bb5d0334721ad20b5097ede9994f44265'}]}, 'timestamp': '2025-11-22 08:44:36.949037', '_unique_id': '2ab902026feb4885adadbe7be894c8cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.950 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.954 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.954 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba833c5f-a52a-4c2a-b3e6-929d975136a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.954780', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79f7f230-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '4a597ebeedae42098c6d464085056b0548334c7acb47a907c676bd9311c71a60'}]}, 'timestamp': '2025-11-22 08:44:36.955422', '_unique_id': 'f55712bb3ed94b20baa8004b7fb3033b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.960 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bc2440b-e5e4-4482-8893-cb50b10ddb7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.960517', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79f8ce30-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': 'c0cf64cc858fa35e030e7219e5a2455c8e53f4fb30131744465d3010bc8eee1d'}]}, 'timestamp': '2025-11-22 08:44:36.961016', '_unique_id': '35f2777cdea940c6b1957e087272d9bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.961 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.965 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.966 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '537c2084-f3b6-4abe-aa45-92d50e3142b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.965639', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79f9978e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '9b02aad913ca0bb47c41833006f6bbe056573fded07fb49783ec51b0f1a95d54'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.965639', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79f9a756-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': 'fe7a7bda9f91d2d475fe3891efa838e2b82dd81cee8b47f3beba0b8375519e0b'}]}, 'timestamp': '2025-11-22 08:44:36.966499', '_unique_id': 'de91e18e7cf94ebe98164b13503322c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.970 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.970 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.read.latency volume: 1251669669 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.971 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.read.latency volume: 655226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58c76c1e-75f9-40bc-8f0c-60ddf3e85427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1251669669, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.970773', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79fa5caa-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '363ceacf93070f34fdbfb65e21173bca2410f3b2f058fa302123467493609079'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 655226, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.970773', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79fa697a-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '6784144f1420987f7d8689256de0e85b142f9ba476e5cf57f4fe260e133946e4'}]}, 'timestamp': '2025-11-22 08:44:36.971477', '_unique_id': '50c933dab3914f6c907e1f1b856d3178'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.972 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.975 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c3cd963-80a4-47ce-aafa-f18999c35dd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.975758', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79fb22a2-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '72172973e7f8bc1257acc7dc9001a06fbb23f69841b70430aa16995a5d97f2fb'}]}, 'timestamp': '2025-11-22 08:44:36.976284', '_unique_id': '999a4e1165a74feba66e424a72f22d88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.977 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.980 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.981 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26ff082c-c04d-4712-98c6-2e3e253d3cc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.980782', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79fbe6f6-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.56881976, 'message_signature': 'c673903a86342cb4e3e3102148a85e73095273cb5da2d2dc30d11ecbb55c6218'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.980782', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79fbf768-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.56881976, 'message_signature': '2965d4001216d8e20faf9548f8b8ab821ac20dbae68eb54fadc66b458bf77eda'}]}, 'timestamp': '2025-11-22 08:44:36.981680', '_unique_id': 'bfb8fe5abea548eab43c52dd6552b692'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.986 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.986 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '130fe4c8-e058-415b-870b-7475892e04e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.986275', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79fcbd24-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.56881976, 'message_signature': '76d941ef213e9fd9b907c4d8d41d2962dc3046111b1a863ac50ed54f468d3740'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.986275', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79fcd534-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.56881976, 'message_signature': 'dfcc35265a97d2ac1ba780d49fef3eb0c619db409774caca8342f54ed9bd9594'}]}, 'timestamp': '2025-11-22 08:44:36.987369', '_unique_id': '8ac6134e797246eaae4111ba03ccf7e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.988 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:44:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.991 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3dfdb5c-7d0c-4ebe-95fd-6fe5e0ac4b96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:36.991182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79fd7c78-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '2d451c85d9e524cb0557c0ef3fdc12d49f5cbbaf3d7ae37cbadea1dedbf38bad'}]}, 'timestamp': '2025-11-22 08:44:36.991770', '_unique_id': '86c82abf0a0f410bb5e7c8e2d4f56146'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.992 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.994 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.994 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44c85a09-2f9b-4399-8db7-bfc608dd7644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.994067', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79fde7c6-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': 'df2366b005509e58014a2eb167a028a329b0ade7f7ccbf5287b9d3fe18797679'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.994067', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79fdf32e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '0b54bd60f4fa002d5b19658991e51ced7351099f8d7b70ce70d40b6c6d6e156b'}]}, 'timestamp': '2025-11-22 08:44:36.994706', '_unique_id': '16538bf7a21e46f59db08ca5c8f62caf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.995 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.996 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.996 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.996 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 1642882a-1940-4212-a4b4-85fb63259b3d: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.997 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.997 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1738764145>]
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.997 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.998 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7285427f-be65-46d7-9c16-bdd8a667e8c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:36.997547', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79fe722c-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '2e4238b82879172bc49a735ee9f65a6d37871e62ae906160db89f4cf83ed6af4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:36.997547', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79fe8104-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '1f77ecdce328ce28c2ed9185201aaf3ea37a3bfa9f434414f01ff2a33c24a525'}]}, 'timestamp': '2025-11-22 08:44:36.998330', '_unique_id': '4f6001ee2eb245cf95b76be1eca8c101'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:36.999 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.002 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3dee2df-7b79-4066-8e67-06f443219542', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:37.002116', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79ff251e-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': '6f229492e6706ffc68e76c186129b033ba60002baef413135bc70b41dd3c9bc4'}]}, 'timestamp': '2025-11-22 08:44:37.002551', '_unique_id': 'bd676e047b2645a9bb401debe7f90fb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.003 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.004 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '570047bb-fcac-467b-8118-74cc43475e6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000af-1642882a-1940-4212-a4b4-85fb63259b3d-tap3033e008-cd', 'timestamp': '2025-11-22T08:44:37.004856', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'tap3033e008-cd', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:d1:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3033e008-cd'}, 'message_id': '79ff8edc-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.559591152, 'message_signature': 'da62e8a6948b945657fd1c47aabc2548885557037e616a555e23afef73506d42'}]}, 'timestamp': '2025-11-22 08:44:37.005250', '_unique_id': 'c0ab20a429bd4d4f8214ab8edd4acd4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.006 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.007 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.007 12 DEBUG ceilometer.compute.pollsters [-] 1642882a-1940-4212-a4b4-85fb63259b3d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b3119a9-9996-4178-9411-60c0d2499f70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-vda', 'timestamp': '2025-11-22T08:44:37.007337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '79ffef8a-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': 'dc0a8537ab13e5536d0adb46235f57a0b1ec6cab1fe9c1475a6cdb45eafef1ad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1642882a-1940-4212-a4b4-85fb63259b3d-sda', 'timestamp': '2025-11-22T08:44:37.007337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1738764145', 'name': 'instance-000000af', 'instance_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'instance_type': 'm1.nano', 'host': '267a2a30d7b18cb697672146875e487df8231cd0caa1bb373fad6e24', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '79fffd90-c77f-11f0-941d-fa163e6775e5', 'monotonic_time': 7817.621242669, 'message_signature': '3bb73e2637c83d44c34c3330b1a67400e435c8cd8590ab771a977df6caea01f3'}]}, 'timestamp': '2025-11-22 08:44:37.008055', '_unique_id': 'f8bfc0cc173d456e84d4b671e00a3a58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:44:37 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:44:37.008 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:44:37 np0005531888 NetworkManager[55166]: <info>  [1763801077.3683] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Nov 22 03:44:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:37Z|00730|binding|INFO|Releasing lport 0aea6cf7-2802-4ea0-83fb-156a81dc390f from this chassis (sb_readonly=0)
Nov 22 03:44:37 np0005531888 NetworkManager[55166]: <info>  [1763801077.3696] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.369 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:37Z|00731|binding|INFO|Releasing lport 0aea6cf7-2802-4ea0-83fb-156a81dc390f from this chassis (sb_readonly=0)
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.642 186792 DEBUG nova.compute.manager [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-changed-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.642 186792 DEBUG nova.compute.manager [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Refreshing instance network info cache due to event network-changed-3033e008-cd6e-4748-9b74-590905825b5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.643 186792 DEBUG oslo_concurrency.lockutils [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.643 186792 DEBUG oslo_concurrency.lockutils [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:44:37 np0005531888 nova_compute[186788]: 2025-11-22 08:44:37.643 186792 DEBUG nova.network.neutron [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Refreshing network info cache for port 3033e008-cd6e-4748-9b74-590905825b5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.212 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.975 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.976 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:38 np0005531888 nova_compute[186788]: 2025-11-22 08:44:38.976 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.043 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.105 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.106 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.163 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.327 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.329 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5545MB free_disk=73.265625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.329 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.329 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.411 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 1642882a-1940-4212-a4b4-85fb63259b3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.412 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.412 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.452 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.468 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.494 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.495 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.556 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:39 np0005531888 nova_compute[186788]: 2025-11-22 08:44:39.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:44:40 np0005531888 nova_compute[186788]: 2025-11-22 08:44:40.563 186792 DEBUG nova.network.neutron [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updated VIF entry in instance network info cache for port 3033e008-cd6e-4748-9b74-590905825b5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:44:40 np0005531888 nova_compute[186788]: 2025-11-22 08:44:40.563 186792 DEBUG nova.network.neutron [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updating instance_info_cache with network_info: [{"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:44:40 np0005531888 nova_compute[186788]: 2025-11-22 08:44:40.589 186792 DEBUG oslo_concurrency.lockutils [req-498870e7-d5b3-4ca2-9d12-94e247dd33ea req-cc34c0ed-48a1-4066-923a-cfa8be894a1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:44:40 np0005531888 nova_compute[186788]: 2025-11-22 08:44:40.966 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:41 np0005531888 nova_compute[186788]: 2025-11-22 08:44:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:41 np0005531888 nova_compute[186788]: 2025-11-22 08:44:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:41 np0005531888 nova_compute[186788]: 2025-11-22 08:44:41.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:44:43 np0005531888 nova_compute[186788]: 2025-11-22 08:44:43.216 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:43 np0005531888 podman[249979]: 2025-11-22 08:44:43.676951965 +0000 UTC m=+0.048948986 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:44:43 np0005531888 podman[249980]: 2025-11-22 08:44:43.682696486 +0000 UTC m=+0.051996140 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 03:44:44 np0005531888 nova_compute[186788]: 2025-11-22 08:44:44.557 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:48 np0005531888 nova_compute[186788]: 2025-11-22 08:44:48.220 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:48Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:79:d1:70 10.100.0.8
Nov 22 03:44:48 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:48Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:79:d1:70 10.100.0.8
Nov 22 03:44:49 np0005531888 nova_compute[186788]: 2025-11-22 08:44:49.559 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:53 np0005531888 nova_compute[186788]: 2025-11-22 08:44:53.222 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:54 np0005531888 nova_compute[186788]: 2025-11-22 08:44:54.411 186792 INFO nova.compute.manager [None req-cf034ed2-8e75-4e78-bedc-7e1b5504d974 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Get console output#033[00m
Nov 22 03:44:54 np0005531888 nova_compute[186788]: 2025-11-22 08:44:54.416 213221 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:44:54 np0005531888 nova_compute[186788]: 2025-11-22 08:44:54.561 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:54 np0005531888 nova_compute[186788]: 2025-11-22 08:44:54.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:55 np0005531888 podman[250038]: 2025-11-22 08:44:55.680796557 +0000 UTC m=+0.054469061 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:44:55 np0005531888 podman[250037]: 2025-11-22 08:44:55.689890721 +0000 UTC m=+0.066318242 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:44:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:55.818 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:44:55 np0005531888 nova_compute[186788]: 2025-11-22 08:44:55.818 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:55.819 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:44:56 np0005531888 nova_compute[186788]: 2025-11-22 08:44:56.104 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:56 np0005531888 nova_compute[186788]: 2025-11-22 08:44:56.104 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:44:56 np0005531888 nova_compute[186788]: 2025-11-22 08:44:56.128 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:44:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:56Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:79:d1:70 10.100.0.8
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.773 186792 DEBUG nova.compute.manager [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-changed-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.773 186792 DEBUG nova.compute.manager [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Refreshing instance network info cache due to event network-changed-3033e008-cd6e-4748-9b74-590905825b5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.773 186792 DEBUG oslo_concurrency.lockutils [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.774 186792 DEBUG oslo_concurrency.lockutils [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.774 186792 DEBUG nova.network.neutron [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Refreshing network info cache for port 3033e008-cd6e-4748-9b74-590905825b5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.828 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.829 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.829 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.830 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.830 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.839 186792 INFO nova.compute.manager [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Terminating instance#033[00m
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.845 186792 DEBUG nova.compute.manager [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:44:57 np0005531888 kernel: tap3033e008-cd (unregistering): left promiscuous mode
Nov 22 03:44:57 np0005531888 NetworkManager[55166]: <info>  [1763801097.9822] device (tap3033e008-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:44:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:57Z|00732|binding|INFO|Releasing lport 3033e008-cd6e-4748-9b74-590905825b5d from this chassis (sb_readonly=0)
Nov 22 03:44:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:57Z|00733|binding|INFO|Setting lport 3033e008-cd6e-4748-9b74-590905825b5d down in Southbound
Nov 22 03:44:57 np0005531888 nova_compute[186788]: 2025-11-22 08:44:57.991 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:57 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:57Z|00734|binding|INFO|Removing iface tap3033e008-cd ovn-installed in OVS
Nov 22 03:44:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:57.998 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:d1:70 10.100.0.8'], port_security=['fa:16:3e:79:d1:70 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea197933-b137-4edc-a409-ccba428ee2cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e44449-0bb8-4e26-9a58-cb316fa17a67', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fd495cc-3d66-4689-af7a-1e87868108e0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=3033e008-cd6e-4748-9b74-590905825b5d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:44:57 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:57.999 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 3033e008-cd6e-4748-9b74-590905825b5d in datapath ea197933-b137-4edc-a409-ccba428ee2cd unbound from our chassis#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.000 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea197933-b137-4edc-a409-ccba428ee2cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.001 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[5325c5e4-1a35-42e8-8b55-c0460d7c6579]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.002 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd namespace which is not needed anymore#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.011 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000af.scope: Deactivated successfully.
Nov 22 03:44:58 np0005531888 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000af.scope: Consumed 14.954s CPU time.
Nov 22 03:44:58 np0005531888 systemd-machined[153106]: Machine qemu-85-instance-000000af terminated.
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.225 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 kernel: tap3033e008-cd: entered promiscuous mode
Nov 22 03:44:58 np0005531888 kernel: tap3033e008-cd (unregistering): left promiscuous mode
Nov 22 03:44:58 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [NOTICE]   (249960) : haproxy version is 2.8.14-c23fe91
Nov 22 03:44:58 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [NOTICE]   (249960) : path to executable is /usr/sbin/haproxy
Nov 22 03:44:58 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [WARNING]  (249960) : Exiting Master process...
Nov 22 03:44:58 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [ALERT]    (249960) : Current worker (249962) exited with code 143 (Terminated)
Nov 22 03:44:58 np0005531888 neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd[249956]: [WARNING]  (249960) : All workers exited. Exiting... (0)
Nov 22 03:44:58 np0005531888 systemd[1]: libpod-415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c.scope: Deactivated successfully.
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00735|binding|INFO|Claiming lport 3033e008-cd6e-4748-9b74-590905825b5d for this chassis.
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00736|binding|INFO|3033e008-cd6e-4748-9b74-590905825b5d: Claiming fa:16:3e:79:d1:70 10.100.0.8
Nov 22 03:44:58 np0005531888 podman[250104]: 2025-11-22 08:44:58.29622176 +0000 UTC m=+0.194805343 container died 415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.302 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:d1:70 10.100.0.8'], port_security=['fa:16:3e:79:d1:70 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea197933-b137-4edc-a409-ccba428ee2cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e44449-0bb8-4e26-9a58-cb316fa17a67', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fd495cc-3d66-4689-af7a-1e87868108e0, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=3033e008-cd6e-4748-9b74-590905825b5d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.316 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00737|binding|INFO|Setting lport 3033e008-cd6e-4748-9b74-590905825b5d ovn-installed in OVS
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00738|binding|INFO|Setting lport 3033e008-cd6e-4748-9b74-590905825b5d up in Southbound
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00739|binding|INFO|Releasing lport 3033e008-cd6e-4748-9b74-590905825b5d from this chassis (sb_readonly=1)
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00740|if_status|INFO|Dropped 2 log messages in last 941 seconds (most recently, 941 seconds ago) due to excessive rate
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00741|if_status|INFO|Not setting lport 3033e008-cd6e-4748-9b74-590905825b5d down as sb is readonly
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00742|binding|INFO|Removing iface tap3033e008-cd ovn-installed in OVS
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.317 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.319 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00743|binding|INFO|Releasing lport 3033e008-cd6e-4748-9b74-590905825b5d from this chassis (sb_readonly=0)
Nov 22 03:44:58 np0005531888 ovn_controller[95067]: 2025-11-22T08:44:58Z|00744|binding|INFO|Setting lport 3033e008-cd6e-4748-9b74-590905825b5d down in Southbound
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.332 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:d1:70 10.100.0.8'], port_security=['fa:16:3e:79:d1:70 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1642882a-1940-4212-a4b4-85fb63259b3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea197933-b137-4edc-a409-ccba428ee2cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '54e44449-0bb8-4e26-9a58-cb316fa17a67', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fd495cc-3d66-4689-af7a-1e87868108e0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=3033e008-cd6e-4748-9b74-590905825b5d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.337 186792 INFO nova.virt.libvirt.driver [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Instance destroyed successfully.#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.337 186792 DEBUG nova.objects.instance [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 1642882a-1940-4212-a4b4-85fb63259b3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.343 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.352 186792 DEBUG nova.virt.libvirt.vif [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1738764145',display_name='tempest-TestNetworkBasicOps-server-1738764145',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1738764145',id=175,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ9I30YDOJc7zeNSbR4OTBRrIZBRWgzX0sO0vVuFLlLZHjv8EuLeLmF0elL6jI/157DdKAZTLm8gwZq48MSq86/kGh73bqrqC9SVezNrCrOmrFURIRseA/Bfw27EY0ZvRw==',key_name='tempest-TestNetworkBasicOps-812610295',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:44:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-vi0k7qqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:44:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=1642882a-1940-4212-a4b4-85fb63259b3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.352 186792 DEBUG nova.network.os_vif_util [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.353 186792 DEBUG nova.network.os_vif_util [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.353 186792 DEBUG os_vif [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.355 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.355 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3033e008-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.358 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.359 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.363 186792 DEBUG nova.compute.manager [req-9d134f51-30b2-4e46-86bd-e6ea302f9b9d req-0b7cb057-1c67-42a0-b007-8943498f5ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-unplugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.364 186792 DEBUG oslo_concurrency.lockutils [req-9d134f51-30b2-4e46-86bd-e6ea302f9b9d req-0b7cb057-1c67-42a0-b007-8943498f5ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.364 186792 DEBUG oslo_concurrency.lockutils [req-9d134f51-30b2-4e46-86bd-e6ea302f9b9d req-0b7cb057-1c67-42a0-b007-8943498f5ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.364 186792 DEBUG oslo_concurrency.lockutils [req-9d134f51-30b2-4e46-86bd-e6ea302f9b9d req-0b7cb057-1c67-42a0-b007-8943498f5ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.365 186792 DEBUG nova.compute.manager [req-9d134f51-30b2-4e46-86bd-e6ea302f9b9d req-0b7cb057-1c67-42a0-b007-8943498f5ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-unplugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.365 186792 DEBUG nova.compute.manager [req-9d134f51-30b2-4e46-86bd-e6ea302f9b9d req-0b7cb057-1c67-42a0-b007-8943498f5ca8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-unplugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.366 186792 INFO os_vif [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:d1:70,bridge_name='br-int',has_traffic_filtering=True,id=3033e008-cd6e-4748-9b74-590905825b5d,network=Network(ea197933-b137-4edc-a409-ccba428ee2cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3033e008-cd')#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.367 186792 INFO nova.virt.libvirt.driver [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Deleting instance files /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d_del#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.367 186792 INFO nova.virt.libvirt.driver [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Deletion of /var/lib/nova/instances/1642882a-1940-4212-a4b4-85fb63259b3d_del complete#033[00m
Nov 22 03:44:58 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c-userdata-shm.mount: Deactivated successfully.
Nov 22 03:44:58 np0005531888 systemd[1]: var-lib-containers-storage-overlay-e43a12e63238784311f83f70143b552db059e45fbd46f7ea4281305de601aadc-merged.mount: Deactivated successfully.
Nov 22 03:44:58 np0005531888 podman[250104]: 2025-11-22 08:44:58.434442051 +0000 UTC m=+0.333025634 container cleanup 415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.438 186792 INFO nova.compute.manager [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.439 186792 DEBUG oslo.service.loopingcall [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.439 186792 DEBUG nova.compute.manager [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.439 186792 DEBUG nova.network.neutron [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:44:58 np0005531888 systemd[1]: libpod-conmon-415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c.scope: Deactivated successfully.
Nov 22 03:44:58 np0005531888 podman[250146]: 2025-11-22 08:44:58.503760837 +0000 UTC m=+0.044520537 container remove 415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.509 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b9ce72-40f7-4332-8542-9ffe9a21961c]: (4, ('Sat Nov 22 08:44:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd (415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c)\n415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c\nSat Nov 22 08:44:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd (415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c)\n415e2b0b3ce2da310b33930ce165575fa0fcff3497f8a44c778a7c45c68a527c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.511 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4e553d88-5f41-41cf-932b-9bf15b4557b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.512 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea197933-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:44:58 np0005531888 kernel: tapea197933-b0: left promiscuous mode
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.514 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 nova_compute[186788]: 2025-11-22 08:44:58.525 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.528 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[9c81bf73-1f0a-4648-8f38-629a68da6b8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.543 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a61aea27-6c61-480f-985f-e8d653e8e645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.544 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a962d42d-e974-44ff-87e1-04295af31444]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.560 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8fc868-f5c5-40f1-ad6d-0cc9583f1644]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 781470, 'reachable_time': 32633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250161, 'error': None, 'target': 'ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.564 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea197933-b137-4edc-a409-ccba428ee2cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.564 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[766b6dc6-9ea6-405f-9a61-18bf03f0286f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 systemd[1]: run-netns-ovnmeta\x2dea197933\x2db137\x2d4edc\x2da409\x2dccba428ee2cd.mount: Deactivated successfully.
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.565 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 3033e008-cd6e-4748-9b74-590905825b5d in datapath ea197933-b137-4edc-a409-ccba428ee2cd unbound from our chassis#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.566 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea197933-b137-4edc-a409-ccba428ee2cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.567 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fb6315-8325-437f-a5bc-7e04f4e45d94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.567 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 3033e008-cd6e-4748-9b74-590905825b5d in datapath ea197933-b137-4edc-a409-ccba428ee2cd unbound from our chassis#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.570 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea197933-b137-4edc-a409-ccba428ee2cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:44:58 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:44:58.570 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[95a47f15-96c5-4b88-bd53-8b79cc24dd70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.257 186792 DEBUG nova.network.neutron [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updated VIF entry in instance network info cache for port 3033e008-cd6e-4748-9b74-590905825b5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.258 186792 DEBUG nova.network.neutron [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updating instance_info_cache with network_info: [{"id": "3033e008-cd6e-4748-9b74-590905825b5d", "address": "fa:16:3e:79:d1:70", "network": {"id": "ea197933-b137-4edc-a409-ccba428ee2cd", "bridge": "br-int", "label": "tempest-network-smoke--1992926129", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3033e008-cd", "ovs_interfaceid": "3033e008-cd6e-4748-9b74-590905825b5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.274 186792 DEBUG oslo_concurrency.lockutils [req-469f2445-c3cd-4a85-b116-c11141509061 req-27ba3937-23d7-4d61-a3ed-ca10d3763ace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1642882a-1940-4212-a4b4-85fb63259b3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.350 186792 DEBUG nova.network.neutron [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.388 186792 INFO nova.compute.manager [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Took 0.95 seconds to deallocate network for instance.#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.504 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.505 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.563 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.590 186792 DEBUG nova.compute.provider_tree [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.607 186792 DEBUG nova.scheduler.client.report [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.639 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:59 np0005531888 podman[250162]: 2025-11-22 08:44:59.69288534 +0000 UTC m=+0.059870894 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.699 186792 INFO nova.scheduler.client.report [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 1642882a-1940-4212-a4b4-85fb63259b3d#033[00m
Nov 22 03:44:59 np0005531888 podman[250164]: 2025-11-22 08:44:59.71523665 +0000 UTC m=+0.077607920 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:44:59 np0005531888 podman[250163]: 2025-11-22 08:44:59.724316253 +0000 UTC m=+0.091343968 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.765 186792 DEBUG oslo_concurrency.lockutils [None req-768cc8f8-d7b6-4490-a9b9-10d3b2a7f8f5 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:59 np0005531888 nova_compute[186788]: 2025-11-22 08:44:59.867 186792 DEBUG nova.compute.manager [req-352c1b0a-b692-46f6-98c5-45eb91df082a req-1f375ada-1e30-46ad-a852-2a65b7bd30e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-deleted-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.520 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.520 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.520 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.521 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.521 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.522 186792 WARNING nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received unexpected event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.522 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.522 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.523 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.523 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.523 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.523 186792 WARNING nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received unexpected event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.524 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.524 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.524 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.525 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.525 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.525 186792 WARNING nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received unexpected event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.525 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-unplugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.526 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.526 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.526 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.527 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-unplugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.527 186792 WARNING nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received unexpected event network-vif-unplugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.527 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.527 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.528 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.528 186792 DEBUG oslo_concurrency.lockutils [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1642882a-1940-4212-a4b4-85fb63259b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.528 186792 DEBUG nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] No waiting events found dispatching network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:45:00 np0005531888 nova_compute[186788]: 2025-11-22 08:45:00.529 186792 WARNING nova.compute.manager [req-42e84842-6f5d-47bb-a275-2f859a791a5c req-a5d0d9bc-6c08-4ec0-a564-8500b4c4058a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Received unexpected event network-vif-plugged-3033e008-cd6e-4748-9b74-590905825b5d for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:45:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:45:00.821 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:45:03 np0005531888 nova_compute[186788]: 2025-11-22 08:45:03.357 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:04 np0005531888 nova_compute[186788]: 2025-11-22 08:45:04.566 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:05 np0005531888 nova_compute[186788]: 2025-11-22 08:45:05.086 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:05 np0005531888 nova_compute[186788]: 2025-11-22 08:45:05.153 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:05 np0005531888 nova_compute[186788]: 2025-11-22 08:45:05.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:08 np0005531888 nova_compute[186788]: 2025-11-22 08:45:08.360 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:09 np0005531888 nova_compute[186788]: 2025-11-22 08:45:09.567 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:13 np0005531888 nova_compute[186788]: 2025-11-22 08:45:13.335 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801098.3342438, 1642882a-1940-4212-a4b4-85fb63259b3d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:45:13 np0005531888 nova_compute[186788]: 2025-11-22 08:45:13.335 186792 INFO nova.compute.manager [-] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:45:13 np0005531888 nova_compute[186788]: 2025-11-22 08:45:13.362 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:13 np0005531888 nova_compute[186788]: 2025-11-22 08:45:13.442 186792 DEBUG nova.compute.manager [None req-af1e1281-7889-4e70-9ee7-7cb02201aa1a - - - - - -] [instance: 1642882a-1940-4212-a4b4-85fb63259b3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:45:14 np0005531888 nova_compute[186788]: 2025-11-22 08:45:14.568 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:14 np0005531888 podman[250233]: 2025-11-22 08:45:14.686533148 +0000 UTC m=+0.050200197 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 03:45:14 np0005531888 podman[250232]: 2025-11-22 08:45:14.686545498 +0000 UTC m=+0.056717997 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:45:18 np0005531888 nova_compute[186788]: 2025-11-22 08:45:18.363 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:19 np0005531888 nova_compute[186788]: 2025-11-22 08:45:19.570 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:23 np0005531888 nova_compute[186788]: 2025-11-22 08:45:23.365 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:24 np0005531888 nova_compute[186788]: 2025-11-22 08:45:24.571 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:25 np0005531888 nova_compute[186788]: 2025-11-22 08:45:25.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:26 np0005531888 podman[250277]: 2025-11-22 08:45:26.678801747 +0000 UTC m=+0.049884069 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:45:26 np0005531888 podman[250276]: 2025-11-22 08:45:26.692422461 +0000 UTC m=+0.067048470 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 03:45:26 np0005531888 nova_compute[186788]: 2025-11-22 08:45:26.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:27 np0005531888 nova_compute[186788]: 2025-11-22 08:45:27.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:27 np0005531888 nova_compute[186788]: 2025-11-22 08:45:27.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:45:27 np0005531888 nova_compute[186788]: 2025-11-22 08:45:27.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:45:27 np0005531888 nova_compute[186788]: 2025-11-22 08:45:27.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:45:28 np0005531888 nova_compute[186788]: 2025-11-22 08:45:28.367 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:29 np0005531888 nova_compute[186788]: 2025-11-22 08:45:29.572 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:30 np0005531888 podman[250322]: 2025-11-22 08:45:30.679292144 +0000 UTC m=+0.053540838 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:45:30 np0005531888 podman[250321]: 2025-11-22 08:45:30.680797711 +0000 UTC m=+0.058396368 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible)
Nov 22 03:45:30 np0005531888 podman[250323]: 2025-11-22 08:45:30.710112242 +0000 UTC m=+0.079652860 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:45:33 np0005531888 nova_compute[186788]: 2025-11-22 08:45:33.369 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:33 np0005531888 nova_compute[186788]: 2025-11-22 08:45:33.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:34 np0005531888 nova_compute[186788]: 2025-11-22 08:45:34.573 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:45:36.868 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:45:36.869 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:45:36.869 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:38 np0005531888 nova_compute[186788]: 2025-11-22 08:45:38.370 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:39 np0005531888 nova_compute[186788]: 2025-11-22 08:45:39.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:40 np0005531888 nova_compute[186788]: 2025-11-22 08:45:40.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.141 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.142 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5696MB free_disk=73.26662063598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:45:41Z|00745|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.210 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.211 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.241 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.257 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.257 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.270 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.290 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.311 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.326 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.352 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:45:41 np0005531888 nova_compute[186788]: 2025-11-22 08:45:41.353 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:43 np0005531888 nova_compute[186788]: 2025-11-22 08:45:43.353 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:43 np0005531888 nova_compute[186788]: 2025-11-22 08:45:43.353 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:45:43 np0005531888 nova_compute[186788]: 2025-11-22 08:45:43.372 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:43 np0005531888 nova_compute[186788]: 2025-11-22 08:45:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:44 np0005531888 nova_compute[186788]: 2025-11-22 08:45:44.576 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:45 np0005531888 podman[250383]: 2025-11-22 08:45:45.679417961 +0000 UTC m=+0.053901857 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:45:45 np0005531888 podman[250384]: 2025-11-22 08:45:45.710738301 +0000 UTC m=+0.076568624 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:45:48 np0005531888 nova_compute[186788]: 2025-11-22 08:45:48.373 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:49 np0005531888 nova_compute[186788]: 2025-11-22 08:45:49.578 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:53 np0005531888 nova_compute[186788]: 2025-11-22 08:45:53.374 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:54 np0005531888 nova_compute[186788]: 2025-11-22 08:45:54.579 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:57 np0005531888 podman[250426]: 2025-11-22 08:45:57.679519284 +0000 UTC m=+0.054400250 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 03:45:57 np0005531888 podman[250427]: 2025-11-22 08:45:57.679575585 +0000 UTC m=+0.052635786 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:45:58 np0005531888 nova_compute[186788]: 2025-11-22 08:45:58.376 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:59 np0005531888 nova_compute[186788]: 2025-11-22 08:45:59.581 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:01 np0005531888 podman[250468]: 2025-11-22 08:46:01.68036033 +0000 UTC m=+0.047291165 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:46:01 np0005531888 podman[250467]: 2025-11-22 08:46:01.68116546 +0000 UTC m=+0.050762639 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=)
Nov 22 03:46:01 np0005531888 podman[250469]: 2025-11-22 08:46:01.71411882 +0000 UTC m=+0.077873826 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:46:03 np0005531888 nova_compute[186788]: 2025-11-22 08:46:03.377 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:04 np0005531888 nova_compute[186788]: 2025-11-22 08:46:04.581 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:05 np0005531888 nova_compute[186788]: 2025-11-22 08:46:05.879 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:46:08 np0005531888 nova_compute[186788]: 2025-11-22 08:46:08.379 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:09 np0005531888 nova_compute[186788]: 2025-11-22 08:46:09.583 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:13 np0005531888 nova_compute[186788]: 2025-11-22 08:46:13.381 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:14 np0005531888 nova_compute[186788]: 2025-11-22 08:46:14.588 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:46:16.209 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:46:16 np0005531888 nova_compute[186788]: 2025-11-22 08:46:16.210 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:16 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:46:16.211 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:46:16 np0005531888 podman[250529]: 2025-11-22 08:46:16.670788778 +0000 UTC m=+0.049639452 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:46:16 np0005531888 podman[250530]: 2025-11-22 08:46:16.673826583 +0000 UTC m=+0.050592115 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 03:46:18 np0005531888 nova_compute[186788]: 2025-11-22 08:46:18.383 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:19 np0005531888 nova_compute[186788]: 2025-11-22 08:46:19.588 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:23 np0005531888 nova_compute[186788]: 2025-11-22 08:46:23.384 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:24 np0005531888 nova_compute[186788]: 2025-11-22 08:46:24.590 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:26 np0005531888 nova_compute[186788]: 2025-11-22 08:46:26.064 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:46:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:46:26.213 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:46:28 np0005531888 nova_compute[186788]: 2025-11-22 08:46:28.386 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:28 np0005531888 podman[250571]: 2025-11-22 08:46:28.456419902 +0000 UTC m=+0.042493296 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:46:28 np0005531888 podman[250570]: 2025-11-22 08:46:28.462472921 +0000 UTC m=+0.052234196 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 03:46:28 np0005531888 nova_compute[186788]: 2025-11-22 08:46:28.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:46:28 np0005531888 nova_compute[186788]: 2025-11-22 08:46:28.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:46:28 np0005531888 nova_compute[186788]: 2025-11-22 08:46:28.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 03:46:28 np0005531888 nova_compute[186788]: 2025-11-22 08:46:28.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 03:46:28 np0005531888 nova_compute[186788]: 2025-11-22 08:46:28.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 03:46:29 np0005531888 nova_compute[186788]: 2025-11-22 08:46:29.592 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:32 np0005531888 podman[250612]: 2025-11-22 08:46:32.66953765 +0000 UTC m=+0.046941076 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:46:32 np0005531888 podman[250613]: 2025-11-22 08:46:32.679554016 +0000 UTC m=+0.052374609 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 22 03:46:32 np0005531888 podman[250614]: 2025-11-22 08:46:32.698621885 +0000 UTC m=+0.067357228 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:46:33 np0005531888 nova_compute[186788]: 2025-11-22 08:46:33.388 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:34 np0005531888 nova_compute[186788]: 2025-11-22 08:46:34.594 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:35 np0005531888 nova_compute[186788]: 2025-11-22 08:46:35.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.856 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:46:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:46:36.869 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:46:36.870 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:46:36.870 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:38 np0005531888 nova_compute[186788]: 2025-11-22 08:46:38.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:39 np0005531888 nova_compute[186788]: 2025-11-22 08:46:39.595 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:40 np0005531888 nova_compute[186788]: 2025-11-22 08:46:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:40 np0005531888 nova_compute[186788]: 2025-11-22 08:46:40.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:40 np0005531888 nova_compute[186788]: 2025-11-22 08:46:40.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:40 np0005531888 nova_compute[186788]: 2025-11-22 08:46:40.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:40 np0005531888 nova_compute[186788]: 2025-11-22 08:46:40.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:40 np0005531888 nova_compute[186788]: 2025-11-22 08:46:40.982 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.146 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.147 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5719MB free_disk=73.26662063598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.147 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.147 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.304 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.304 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.390 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.402 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.404 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:46:41 np0005531888 nova_compute[186788]: 2025-11-22 08:46:41.404 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:42 np0005531888 nova_compute[186788]: 2025-11-22 08:46:42.403 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:43 np0005531888 nova_compute[186788]: 2025-11-22 08:46:43.391 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:43 np0005531888 nova_compute[186788]: 2025-11-22 08:46:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:43 np0005531888 nova_compute[186788]: 2025-11-22 08:46:43.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:46:44 np0005531888 nova_compute[186788]: 2025-11-22 08:46:44.596 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:45 np0005531888 nova_compute[186788]: 2025-11-22 08:46:45.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:47 np0005531888 podman[250677]: 2025-11-22 08:46:47.678042383 +0000 UTC m=+0.052688617 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:46:47 np0005531888 podman[250676]: 2025-11-22 08:46:47.704607886 +0000 UTC m=+0.082067599 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:46:48 np0005531888 nova_compute[186788]: 2025-11-22 08:46:48.392 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:49 np0005531888 nova_compute[186788]: 2025-11-22 08:46:49.597 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:53 np0005531888 nova_compute[186788]: 2025-11-22 08:46:53.393 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:54 np0005531888 nova_compute[186788]: 2025-11-22 08:46:54.598 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:58 np0005531888 nova_compute[186788]: 2025-11-22 08:46:58.395 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:58 np0005531888 podman[250717]: 2025-11-22 08:46:58.681928557 +0000 UTC m=+0.055166348 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:46:58 np0005531888 podman[250718]: 2025-11-22 08:46:58.692591699 +0000 UTC m=+0.058155431 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:46:59 np0005531888 nova_compute[186788]: 2025-11-22 08:46:59.601 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:03 np0005531888 nova_compute[186788]: 2025-11-22 08:47:03.396 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:03 np0005531888 podman[250761]: 2025-11-22 08:47:03.683633147 +0000 UTC m=+0.058087910 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:47:03 np0005531888 podman[250762]: 2025-11-22 08:47:03.684530919 +0000 UTC m=+0.055028365 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:47:03 np0005531888 podman[250763]: 2025-11-22 08:47:03.708321594 +0000 UTC m=+0.077810385 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 03:47:04 np0005531888 nova_compute[186788]: 2025-11-22 08:47:04.604 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:06 np0005531888 nova_compute[186788]: 2025-11-22 08:47:06.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:08 np0005531888 nova_compute[186788]: 2025-11-22 08:47:08.398 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:09 np0005531888 nova_compute[186788]: 2025-11-22 08:47:09.604 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:13 np0005531888 nova_compute[186788]: 2025-11-22 08:47:13.400 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:14 np0005531888 nova_compute[186788]: 2025-11-22 08:47:14.605 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:18 np0005531888 nova_compute[186788]: 2025-11-22 08:47:18.401 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:18 np0005531888 podman[250828]: 2025-11-22 08:47:18.668597249 +0000 UTC m=+0.044198417 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:47:18 np0005531888 podman[250829]: 2025-11-22 08:47:18.675217862 +0000 UTC m=+0.050384360 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:47:19 np0005531888 nova_compute[186788]: 2025-11-22 08:47:19.606 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:20.371 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:47:20 np0005531888 nova_compute[186788]: 2025-11-22 08:47:20.372 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:20.372 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:47:23 np0005531888 nova_compute[186788]: 2025-11-22 08:47:23.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:24 np0005531888 nova_compute[186788]: 2025-11-22 08:47:24.607 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:25 np0005531888 nova_compute[186788]: 2025-11-22 08:47:25.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:28 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:28.374 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:47:28 np0005531888 nova_compute[186788]: 2025-11-22 08:47:28.403 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:29 np0005531888 nova_compute[186788]: 2025-11-22 08:47:29.609 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:29 np0005531888 podman[250868]: 2025-11-22 08:47:29.695669503 +0000 UTC m=+0.060609701 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:47:29 np0005531888 podman[250869]: 2025-11-22 08:47:29.696435482 +0000 UTC m=+0.058423738 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:47:30 np0005531888 nova_compute[186788]: 2025-11-22 08:47:30.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:30 np0005531888 nova_compute[186788]: 2025-11-22 08:47:30.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:30 np0005531888 nova_compute[186788]: 2025-11-22 08:47:30.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:47:30 np0005531888 nova_compute[186788]: 2025-11-22 08:47:30.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:47:30 np0005531888 nova_compute[186788]: 2025-11-22 08:47:30.976 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:47:33 np0005531888 nova_compute[186788]: 2025-11-22 08:47:33.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:34 np0005531888 nova_compute[186788]: 2025-11-22 08:47:34.610 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:34 np0005531888 podman[250912]: 2025-11-22 08:47:34.697371782 +0000 UTC m=+0.067355778 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:47:34 np0005531888 podman[250913]: 2025-11-22 08:47:34.704213201 +0000 UTC m=+0.068518317 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:47:34 np0005531888 podman[250914]: 2025-11-22 08:47:34.751488144 +0000 UTC m=+0.114590930 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:47:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:36.871 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:47:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:36.871 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:47:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:36.871 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:47:37 np0005531888 nova_compute[186788]: 2025-11-22 08:47:37.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:38 np0005531888 nova_compute[186788]: 2025-11-22 08:47:38.408 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:39 np0005531888 nova_compute[186788]: 2025-11-22 08:47:39.613 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:47:42 np0005531888 nova_compute[186788]: 2025-11-22 08:47:42.981 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.166 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.168 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5704MB free_disk=73.26662063598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.168 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.168 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.229 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.229 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.256 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.289 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.290 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.290 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:47:43 np0005531888 nova_compute[186788]: 2025-11-22 08:47:43.410 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:44 np0005531888 nova_compute[186788]: 2025-11-22 08:47:44.290 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:44 np0005531888 nova_compute[186788]: 2025-11-22 08:47:44.290 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:47:44 np0005531888 nova_compute[186788]: 2025-11-22 08:47:44.614 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:46 np0005531888 nova_compute[186788]: 2025-11-22 08:47:46.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:48 np0005531888 nova_compute[186788]: 2025-11-22 08:47:48.411 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:49 np0005531888 nova_compute[186788]: 2025-11-22 08:47:49.617 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:49 np0005531888 podman[250976]: 2025-11-22 08:47:49.672603657 +0000 UTC m=+0.048168997 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 03:47:49 np0005531888 podman[250975]: 2025-11-22 08:47:49.673220282 +0000 UTC m=+0.048882744 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:47:53 np0005531888 nova_compute[186788]: 2025-11-22 08:47:53.413 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:54 np0005531888 nova_compute[186788]: 2025-11-22 08:47:54.619 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:58 np0005531888 nova_compute[186788]: 2025-11-22 08:47:58.414 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:59.468 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:47:59 np0005531888 nova_compute[186788]: 2025-11-22 08:47:59.469 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:47:59.469 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:47:59 np0005531888 nova_compute[186788]: 2025-11-22 08:47:59.620 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:00.472 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:00 np0005531888 podman[251018]: 2025-11-22 08:48:00.686903596 +0000 UTC m=+0.050309359 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:48:00 np0005531888 podman[251019]: 2025-11-22 08:48:00.700490861 +0000 UTC m=+0.056395389 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:48:03 np0005531888 nova_compute[186788]: 2025-11-22 08:48:03.415 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:04 np0005531888 nova_compute[186788]: 2025-11-22 08:48:04.622 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:05 np0005531888 podman[251061]: 2025-11-22 08:48:05.701755479 +0000 UTC m=+0.070991228 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, name=ubi9-minimal)
Nov 22 03:48:05 np0005531888 podman[251063]: 2025-11-22 08:48:05.737817746 +0000 UTC m=+0.104607224 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:48:05 np0005531888 podman[251062]: 2025-11-22 08:48:05.739070727 +0000 UTC m=+0.109074394 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:48:08 np0005531888 nova_compute[186788]: 2025-11-22 08:48:08.417 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:09 np0005531888 nova_compute[186788]: 2025-11-22 08:48:09.623 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:13 np0005531888 nova_compute[186788]: 2025-11-22 08:48:13.418 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:14 np0005531888 nova_compute[186788]: 2025-11-22 08:48:14.626 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:18 np0005531888 nova_compute[186788]: 2025-11-22 08:48:18.421 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:19 np0005531888 nova_compute[186788]: 2025-11-22 08:48:19.627 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:20 np0005531888 podman[251123]: 2025-11-22 08:48:20.679544575 +0000 UTC m=+0.053316833 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:48:20 np0005531888 podman[251124]: 2025-11-22 08:48:20.713203993 +0000 UTC m=+0.078423750 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:48:23 np0005531888 nova_compute[186788]: 2025-11-22 08:48:23.422 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:24 np0005531888 nova_compute[186788]: 2025-11-22 08:48:24.628 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:27 np0005531888 nova_compute[186788]: 2025-11-22 08:48:27.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:28 np0005531888 nova_compute[186788]: 2025-11-22 08:48:28.423 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:29 np0005531888 nova_compute[186788]: 2025-11-22 08:48:29.630 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:30 np0005531888 nova_compute[186788]: 2025-11-22 08:48:30.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:31 np0005531888 podman[251166]: 2025-11-22 08:48:31.685330475 +0000 UTC m=+0.055108768 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:48:31 np0005531888 podman[251165]: 2025-11-22 08:48:31.723691258 +0000 UTC m=+0.095305186 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 03:48:31 np0005531888 nova_compute[186788]: 2025-11-22 08:48:31.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:31 np0005531888 nova_compute[186788]: 2025-11-22 08:48:31.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:48:31 np0005531888 nova_compute[186788]: 2025-11-22 08:48:31.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:48:31 np0005531888 nova_compute[186788]: 2025-11-22 08:48:31.974 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:48:33 np0005531888 nova_compute[186788]: 2025-11-22 08:48:33.425 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:34 np0005531888 nova_compute[186788]: 2025-11-22 08:48:34.631 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:36 np0005531888 podman[251204]: 2025-11-22 08:48:36.680419372 +0000 UTC m=+0.058180362 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Nov 22 03:48:36 np0005531888 podman[251205]: 2025-11-22 08:48:36.711069706 +0000 UTC m=+0.084840747 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:48:36 np0005531888 podman[251206]: 2025-11-22 08:48:36.74372479 +0000 UTC m=+0.114828025 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.857 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.858 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:48:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:36.871 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:36.872 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:36.872 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:38 np0005531888 nova_compute[186788]: 2025-11-22 08:48:38.428 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:38 np0005531888 nova_compute[186788]: 2025-11-22 08:48:38.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:39 np0005531888 nova_compute[186788]: 2025-11-22 08:48:39.632 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:40.455 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:48:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:40.456 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:48:40 np0005531888 nova_compute[186788]: 2025-11-22 08:48:40.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:42 np0005531888 nova_compute[186788]: 2025-11-22 08:48:42.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:43 np0005531888 nova_compute[186788]: 2025-11-22 08:48:43.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:43 np0005531888 nova_compute[186788]: 2025-11-22 08:48:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.633 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:44 np0005531888 nova_compute[186788]: 2025-11-22 08:48:44.988 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.148 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.149 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5708MB free_disk=73.26662063598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.149 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.149 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.198 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.199 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.218 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.228 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.229 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:48:45 np0005531888 nova_compute[186788]: 2025-11-22 08:48:45.229 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:47 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:47.459 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:47 np0005531888 nova_compute[186788]: 2025-11-22 08:48:47.895 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:47 np0005531888 nova_compute[186788]: 2025-11-22 08:48:47.896 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:47 np0005531888 nova_compute[186788]: 2025-11-22 08:48:47.911 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.027 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.028 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.035 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.035 186792 INFO nova.compute.claims [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.147 186792 DEBUG nova.compute.provider_tree [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.160 186792 DEBUG nova.scheduler.client.report [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.188 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.188 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.229 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.242 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.243 186792 DEBUG nova.network.neutron [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.265 186792 INFO nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.283 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.433 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.435 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.436 186792 INFO nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Creating image(s)#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.436 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.437 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.437 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.451 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.515 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.517 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.518 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.533 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.593 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.594 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.635 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.637 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.637 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.706 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.708 186792 DEBUG nova.virt.disk.api [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.709 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.767 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.768 186792 DEBUG nova.virt.disk.api [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.769 186792 DEBUG nova.objects.instance [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid ee55d0b1-1b51-43ec-9130-fcd07598b09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.782 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.783 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Ensure instance console log exists: /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.783 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.784 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:48 np0005531888 nova_compute[186788]: 2025-11-22 08:48:48.784 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:49 np0005531888 nova_compute[186788]: 2025-11-22 08:48:49.414 186792 DEBUG nova.policy [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:48:49 np0005531888 nova_compute[186788]: 2025-11-22 08:48:49.635 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:51 np0005531888 nova_compute[186788]: 2025-11-22 08:48:51.286 186792 DEBUG nova.network.neutron [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Successfully created port: 771762db-e480-4c22-adb2-5add2ca49ca1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:48:51 np0005531888 podman[251283]: 2025-11-22 08:48:51.67244405 +0000 UTC m=+0.051043997 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:48:51 np0005531888 podman[251284]: 2025-11-22 08:48:51.701508435 +0000 UTC m=+0.075064918 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.708 186792 DEBUG nova.network.neutron [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Successfully updated port: 771762db-e480-4c22-adb2-5add2ca49ca1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.721 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.722 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.722 186792 DEBUG nova.network.neutron [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.801 186792 DEBUG nova.compute.manager [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-changed-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.801 186792 DEBUG nova.compute.manager [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Refreshing instance network info cache due to event network-changed-771762db-e480-4c22-adb2-5add2ca49ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:48:52 np0005531888 nova_compute[186788]: 2025-11-22 08:48:52.801 186792 DEBUG oslo_concurrency.lockutils [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:48:53 np0005531888 nova_compute[186788]: 2025-11-22 08:48:53.384 186792 DEBUG nova.network.neutron [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:48:53 np0005531888 nova_compute[186788]: 2025-11-22 08:48:53.451 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.482 186792 DEBUG nova.network.neutron [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.528 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.528 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Instance network_info: |[{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.529 186792 DEBUG oslo_concurrency.lockutils [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.529 186792 DEBUG nova.network.neutron [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Refreshing network info cache for port 771762db-e480-4c22-adb2-5add2ca49ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.532 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Start _get_guest_xml network_info=[{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.536 186792 WARNING nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.549 186792 DEBUG nova.virt.libvirt.host [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.549 186792 DEBUG nova.virt.libvirt.host [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.554 186792 DEBUG nova.virt.libvirt.host [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.554 186792 DEBUG nova.virt.libvirt.host [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.556 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.556 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.556 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.556 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.557 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.557 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.557 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.557 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.558 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.558 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.558 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.558 186792 DEBUG nova.virt.hardware [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.561 186792 DEBUG nova.virt.libvirt.vif [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=179,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHijZSELA8UT9gszsLo3RsTctjovO/NtSWYTVXotL5SJYEd8P/VZKPX7fPbirjPiSQmt0NO+JaNFVLlHWa5XGAcq5av/AMCam8IbhgKAnkOspqonr6bwIW7QLasEA2sUlw==',key_name='tempest-TestSecurityGroupsBasicOps-1510272773',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-ek9chz33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:48:48Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=ee55d0b1-1b51-43ec-9130-fcd07598b09d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.562 186792 DEBUG nova.network.os_vif_util [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.562 186792 DEBUG nova.network.os_vif_util [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.563 186792 DEBUG nova.objects.instance [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee55d0b1-1b51-43ec-9130-fcd07598b09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.577 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <uuid>ee55d0b1-1b51-43ec-9130-fcd07598b09d</uuid>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <name>instance-000000b3</name>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267</nova:name>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:48:54</nova:creationTime>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        <nova:port uuid="771762db-e480-4c22-adb2-5add2ca49ca1">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <entry name="serial">ee55d0b1-1b51-43ec-9130-fcd07598b09d</entry>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <entry name="uuid">ee55d0b1-1b51-43ec-9130-fcd07598b09d</entry>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.config"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:83:af:4d"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <target dev="tap771762db-e4"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/console.log" append="off"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:48:54 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:48:54 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:48:54 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:48:54 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.579 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Preparing to wait for external event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.579 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.579 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.580 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.580 186792 DEBUG nova.virt.libvirt.vif [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=179,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHijZSELA8UT9gszsLo3RsTctjovO/NtSWYTVXotL5SJYEd8P/VZKPX7fPbirjPiSQmt0NO+JaNFVLlHWa5XGAcq5av/AMCam8IbhgKAnkOspqonr6bwIW7QLasEA2sUlw==',key_name='tempest-TestSecurityGroupsBasicOps-1510272773',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-ek9chz33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:48:48Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=ee55d0b1-1b51-43ec-9130-fcd07598b09d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.581 186792 DEBUG nova.network.os_vif_util [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.581 186792 DEBUG nova.network.os_vif_util [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.582 186792 DEBUG os_vif [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.582 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.583 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.583 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.586 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.587 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap771762db-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.587 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap771762db-e4, col_values=(('external_ids', {'iface-id': '771762db-e480-4c22-adb2-5add2ca49ca1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:af:4d', 'vm-uuid': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.589 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:54 np0005531888 NetworkManager[55166]: <info>  [1763801334.5906] manager: (tap771762db-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.591 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.598 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.599 186792 INFO os_vif [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4')#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.638 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.747 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.748 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.748 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:83:af:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:48:54 np0005531888 nova_compute[186788]: 2025-11-22 08:48:54.748 186792 INFO nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Using config drive#033[00m
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.571 186792 INFO nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Creating config drive at /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.config#033[00m
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.576 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdha4s860 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.702 186792 DEBUG oslo_concurrency.processutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdha4s860" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:48:55 np0005531888 kernel: tap771762db-e4: entered promiscuous mode
Nov 22 03:48:55 np0005531888 NetworkManager[55166]: <info>  [1763801335.7570] manager: (tap771762db-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/353)
Nov 22 03:48:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:48:55Z|00746|binding|INFO|Claiming lport 771762db-e480-4c22-adb2-5add2ca49ca1 for this chassis.
Nov 22 03:48:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:48:55Z|00747|binding|INFO|771762db-e480-4c22-adb2-5add2ca49ca1: Claiming fa:16:3e:83:af:4d 10.100.0.3
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.757 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.763 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.770 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:af:4d 10.100.0.3'], port_security=['fa:16:3e:83:af:4d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': '715876a5-b868-42b2-a805-7eead09bd16c f416bbe7-d5d7-442c-98e9-b8655a8e5fb3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=37c2f4c5-2df7-42e5-bbdc-62b417459328, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=771762db-e480-4c22-adb2-5add2ca49ca1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.772 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 771762db-e480-4c22-adb2-5add2ca49ca1 in datapath ce8ebe40-99cf-4666-80d8-abaaccce68fa bound to our chassis#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.773 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce8ebe40-99cf-4666-80d8-abaaccce68fa#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.785 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ac520481-1deb-4635-9fa4-6e0a63a3ac88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.786 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce8ebe40-91 in ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.787 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce8ebe40-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.788 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[44f43950-8f92-4d63-bc28-736d14ece6f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.788 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc47fd3-0fc2-4e4f-ac86-5fd39f9d73d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 systemd-udevd[251345]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.802 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[42aba819-f988-4207-ad34-890761b26115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 systemd-machined[153106]: New machine qemu-86-instance-000000b3.
Nov 22 03:48:55 np0005531888 NetworkManager[55166]: <info>  [1763801335.8091] device (tap771762db-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:48:55 np0005531888 NetworkManager[55166]: <info>  [1763801335.8101] device (tap771762db-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.812 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:48:55Z|00748|binding|INFO|Setting lport 771762db-e480-4c22-adb2-5add2ca49ca1 ovn-installed in OVS
Nov 22 03:48:55 np0005531888 ovn_controller[95067]: 2025-11-22T08:48:55Z|00749|binding|INFO|Setting lport 771762db-e480-4c22-adb2-5add2ca49ca1 up in Southbound
Nov 22 03:48:55 np0005531888 nova_compute[186788]: 2025-11-22 08:48:55.817 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:55 np0005531888 systemd[1]: Started Virtual Machine qemu-86-instance-000000b3.
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.825 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6facae8b-5ebd-4b29-ac47-55857c142cdf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.853 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bb6f12-d9e1-406e-b17d-477dda0f79bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.858 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[10a688ed-02e3-45f2-b305-f2990ded7193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 NetworkManager[55166]: <info>  [1763801335.8595] manager: (tapce8ebe40-90): new Veth device (/org/freedesktop/NetworkManager/Devices/354)
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.889 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[15cbafa1-4e49-49e6-9a60-221d0f2ddbe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.892 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[def4f6db-8e84-4731-a10b-1f4e5bc01889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 NetworkManager[55166]: <info>  [1763801335.9147] device (tapce8ebe40-90): carrier: link connected
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.920 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[b6bf12bf-233f-42b0-b9ac-c0fdc0efeedc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.936 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[42a6163c-0816-478f-bd16-1b6ca13799f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce8ebe40-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2b:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 807655, 'reachable_time': 33990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251378, 'error': None, 'target': 'ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.953 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9e457a-2621-4700-a69c-081051f050bd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:2b44'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 807655, 'tstamp': 807655}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251379, 'error': None, 'target': 'ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:55.970 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7ebcb8-99af-4807-913e-62854b255ece]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce8ebe40-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2b:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 807655, 'reachable_time': 33990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251380, 'error': None, 'target': 'ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.000 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[df1adf9a-2e18-4329-9a6b-433c1d5b19a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.060 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[99360097-0d42-4dc5-af8b-aa81ea6d3ee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.062 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce8ebe40-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.062 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.062 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce8ebe40-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:56 np0005531888 kernel: tapce8ebe40-90: entered promiscuous mode
Nov 22 03:48:56 np0005531888 NetworkManager[55166]: <info>  [1763801336.0650] manager: (tapce8ebe40-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.064 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.066 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.068 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce8ebe40-90, col_values=(('external_ids', {'iface-id': '2bd744f8-acc7-4ee7-ae3d-7803a979dc4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.069 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:48:56Z|00750|binding|INFO|Releasing lport 2bd744f8-acc7-4ee7-ae3d-7803a979dc4f from this chassis (sb_readonly=0)
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.082 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.084 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce8ebe40-99cf-4666-80d8-abaaccce68fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce8ebe40-99cf-4666-80d8-abaaccce68fa.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.085 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e35f56-1d31-49b4-b9af-62faced7a7fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.085 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-ce8ebe40-99cf-4666-80d8-abaaccce68fa
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/ce8ebe40-99cf-4666-80d8-abaaccce68fa.pid.haproxy
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID ce8ebe40-99cf-4666-80d8-abaaccce68fa
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:48:56 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:48:56.086 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'env', 'PROCESS_TAG=haproxy-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce8ebe40-99cf-4666-80d8-abaaccce68fa.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.139 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801336.138636, ee55d0b1-1b51-43ec-9130-fcd07598b09d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.139 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] VM Started (Lifecycle Event)#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.158 186792 DEBUG nova.network.neutron [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updated VIF entry in instance network info cache for port 771762db-e480-4c22-adb2-5add2ca49ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.158 186792 DEBUG nova.network.neutron [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.160 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.164 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801336.1392617, ee55d0b1-1b51-43ec-9130-fcd07598b09d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.165 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.186 186792 DEBUG oslo_concurrency.lockutils [req-067a51e6-888b-4484-bd50-27640afde868 req-0c4662a1-ea2d-4550-81d0-9731c0242baf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.189 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.192 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.210 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.403 186792 DEBUG nova.compute.manager [req-57439db6-e8ed-4b4f-8146-4c213938c40c req-7adf72a0-2e5c-4f1f-b360-08cbdbcc27fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.403 186792 DEBUG oslo_concurrency.lockutils [req-57439db6-e8ed-4b4f-8146-4c213938c40c req-7adf72a0-2e5c-4f1f-b360-08cbdbcc27fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.403 186792 DEBUG oslo_concurrency.lockutils [req-57439db6-e8ed-4b4f-8146-4c213938c40c req-7adf72a0-2e5c-4f1f-b360-08cbdbcc27fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.404 186792 DEBUG oslo_concurrency.lockutils [req-57439db6-e8ed-4b4f-8146-4c213938c40c req-7adf72a0-2e5c-4f1f-b360-08cbdbcc27fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.404 186792 DEBUG nova.compute.manager [req-57439db6-e8ed-4b4f-8146-4c213938c40c req-7adf72a0-2e5c-4f1f-b360-08cbdbcc27fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Processing event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.404 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.408 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801336.4079726, ee55d0b1-1b51-43ec-9130-fcd07598b09d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.408 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.410 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.413 186792 INFO nova.virt.libvirt.driver [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Instance spawned successfully.#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.413 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.427 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.435 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.440 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.441 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.442 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.442 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.443 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.443 186792 DEBUG nova.virt.libvirt.driver [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:48:56 np0005531888 podman[251417]: 2025-11-22 08:48:56.451299017 +0000 UTC m=+0.076323389 container create 6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.465 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:48:56 np0005531888 systemd[1]: Started libpod-conmon-6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474.scope.
Nov 22 03:48:56 np0005531888 podman[251417]: 2025-11-22 08:48:56.395531755 +0000 UTC m=+0.020556157 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:48:56 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:48:56 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc3e43da3460363e8ef7c18befb52b70c8392fef6afeb1a55809b96ff21b036/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:48:56 np0005531888 podman[251417]: 2025-11-22 08:48:56.542826969 +0000 UTC m=+0.167851341 container init 6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 03:48:56 np0005531888 podman[251417]: 2025-11-22 08:48:56.550023836 +0000 UTC m=+0.175048208 container start 6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:48:56 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [NOTICE]   (251436) : New worker (251438) forked
Nov 22 03:48:56 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [NOTICE]   (251436) : Loading success.
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.623 186792 INFO nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Took 8.19 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.624 186792 DEBUG nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.773 186792 INFO nova.compute.manager [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Took 8.81 seconds to build instance.#033[00m
Nov 22 03:48:56 np0005531888 nova_compute[186788]: 2025-11-22 08:48:56.822 186792 DEBUG oslo_concurrency.lockutils [None req-e0cabba9-f2fb-426c-9593-6bfaa4618177 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:58 np0005531888 nova_compute[186788]: 2025-11-22 08:48:58.495 186792 DEBUG nova.compute.manager [req-77261df9-e28d-4268-aaf6-d8586534afd6 req-30d4290a-9807-498e-981a-1ad8b3d02c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:48:58 np0005531888 nova_compute[186788]: 2025-11-22 08:48:58.495 186792 DEBUG oslo_concurrency.lockutils [req-77261df9-e28d-4268-aaf6-d8586534afd6 req-30d4290a-9807-498e-981a-1ad8b3d02c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:58 np0005531888 nova_compute[186788]: 2025-11-22 08:48:58.495 186792 DEBUG oslo_concurrency.lockutils [req-77261df9-e28d-4268-aaf6-d8586534afd6 req-30d4290a-9807-498e-981a-1ad8b3d02c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:58 np0005531888 nova_compute[186788]: 2025-11-22 08:48:58.495 186792 DEBUG oslo_concurrency.lockutils [req-77261df9-e28d-4268-aaf6-d8586534afd6 req-30d4290a-9807-498e-981a-1ad8b3d02c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:58 np0005531888 nova_compute[186788]: 2025-11-22 08:48:58.496 186792 DEBUG nova.compute.manager [req-77261df9-e28d-4268-aaf6-d8586534afd6 req-30d4290a-9807-498e-981a-1ad8b3d02c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] No waiting events found dispatching network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:48:58 np0005531888 nova_compute[186788]: 2025-11-22 08:48:58.496 186792 WARNING nova.compute.manager [req-77261df9-e28d-4268-aaf6-d8586534afd6 req-30d4290a-9807-498e-981a-1ad8b3d02c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received unexpected event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:48:59 np0005531888 nova_compute[186788]: 2025-11-22 08:48:59.591 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:59 np0005531888 nova_compute[186788]: 2025-11-22 08:48:59.640 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:00 np0005531888 nova_compute[186788]: 2025-11-22 08:49:00.859 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:00 np0005531888 NetworkManager[55166]: <info>  [1763801340.8616] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Nov 22 03:49:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:49:00Z|00751|binding|INFO|Releasing lport 2bd744f8-acc7-4ee7-ae3d-7803a979dc4f from this chassis (sb_readonly=0)
Nov 22 03:49:00 np0005531888 NetworkManager[55166]: <info>  [1763801340.8627] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Nov 22 03:49:00 np0005531888 nova_compute[186788]: 2025-11-22 08:49:00.888 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:49:00Z|00752|binding|INFO|Releasing lport 2bd744f8-acc7-4ee7-ae3d-7803a979dc4f from this chassis (sb_readonly=0)
Nov 22 03:49:00 np0005531888 nova_compute[186788]: 2025-11-22 08:49:00.892 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:01 np0005531888 nova_compute[186788]: 2025-11-22 08:49:01.284 186792 DEBUG nova.compute.manager [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-changed-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:49:01 np0005531888 nova_compute[186788]: 2025-11-22 08:49:01.285 186792 DEBUG nova.compute.manager [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Refreshing instance network info cache due to event network-changed-771762db-e480-4c22-adb2-5add2ca49ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:49:01 np0005531888 nova_compute[186788]: 2025-11-22 08:49:01.286 186792 DEBUG oslo_concurrency.lockutils [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:49:01 np0005531888 nova_compute[186788]: 2025-11-22 08:49:01.286 186792 DEBUG oslo_concurrency.lockutils [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:49:01 np0005531888 nova_compute[186788]: 2025-11-22 08:49:01.286 186792 DEBUG nova.network.neutron [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Refreshing network info cache for port 771762db-e480-4c22-adb2-5add2ca49ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:49:02 np0005531888 podman[251449]: 2025-11-22 08:49:02.69084722 +0000 UTC m=+0.054077661 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:49:02 np0005531888 podman[251448]: 2025-11-22 08:49:02.695118905 +0000 UTC m=+0.056589953 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 03:49:02 np0005531888 nova_compute[186788]: 2025-11-22 08:49:02.805 186792 DEBUG nova.network.neutron [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updated VIF entry in instance network info cache for port 771762db-e480-4c22-adb2-5add2ca49ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:49:02 np0005531888 nova_compute[186788]: 2025-11-22 08:49:02.805 186792 DEBUG nova.network.neutron [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:49:02 np0005531888 nova_compute[186788]: 2025-11-22 08:49:02.832 186792 DEBUG oslo_concurrency.lockutils [req-b999df77-5d3e-4f9e-b881-5d270591f3bd req-6d802490-6e60-48e3-af04-819f2a147e6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:49:04 np0005531888 nova_compute[186788]: 2025-11-22 08:49:04.600 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:04 np0005531888 nova_compute[186788]: 2025-11-22 08:49:04.641 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:07 np0005531888 podman[251490]: 2025-11-22 08:49:07.68727786 +0000 UTC m=+0.056958533 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Nov 22 03:49:07 np0005531888 podman[251491]: 2025-11-22 08:49:07.701622533 +0000 UTC m=+0.067868141 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 03:49:07 np0005531888 podman[251492]: 2025-11-22 08:49:07.733912447 +0000 UTC m=+0.094354972 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:49:09 np0005531888 nova_compute[186788]: 2025-11-22 08:49:09.602 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:09 np0005531888 nova_compute[186788]: 2025-11-22 08:49:09.642 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:11 np0005531888 nova_compute[186788]: 2025-11-22 08:49:11.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:14 np0005531888 nova_compute[186788]: 2025-11-22 08:49:14.604 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:14 np0005531888 nova_compute[186788]: 2025-11-22 08:49:14.644 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:14 np0005531888 ovn_controller[95067]: 2025-11-22T08:49:14Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:af:4d 10.100.0.3
Nov 22 03:49:14 np0005531888 ovn_controller[95067]: 2025-11-22T08:49:14Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:af:4d 10.100.0.3
Nov 22 03:49:19 np0005531888 nova_compute[186788]: 2025-11-22 08:49:19.608 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:19 np0005531888 nova_compute[186788]: 2025-11-22 08:49:19.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:22 np0005531888 podman[251579]: 2025-11-22 08:49:22.673604068 +0000 UTC m=+0.046699919 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:49:22 np0005531888 podman[251580]: 2025-11-22 08:49:22.673814593 +0000 UTC m=+0.043393078 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:49:24 np0005531888 nova_compute[186788]: 2025-11-22 08:49:24.612 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:24 np0005531888 nova_compute[186788]: 2025-11-22 08:49:24.648 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:28 np0005531888 nova_compute[186788]: 2025-11-22 08:49:28.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:29 np0005531888 nova_compute[186788]: 2025-11-22 08:49:29.615 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:29 np0005531888 nova_compute[186788]: 2025-11-22 08:49:29.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:30 np0005531888 ovn_controller[95067]: 2025-11-22T08:49:30Z|00753|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 03:49:32 np0005531888 nova_compute[186788]: 2025-11-22 08:49:32.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:32 np0005531888 nova_compute[186788]: 2025-11-22 08:49:32.952 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:32 np0005531888 nova_compute[186788]: 2025-11-22 08:49:32.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:49:32 np0005531888 nova_compute[186788]: 2025-11-22 08:49:32.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:49:33 np0005531888 nova_compute[186788]: 2025-11-22 08:49:33.414 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:49:33 np0005531888 nova_compute[186788]: 2025-11-22 08:49:33.415 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:49:33 np0005531888 nova_compute[186788]: 2025-11-22 08:49:33.415 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:49:33 np0005531888 nova_compute[186788]: 2025-11-22 08:49:33.415 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ee55d0b1-1b51-43ec-9130-fcd07598b09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:49:33 np0005531888 podman[251622]: 2025-11-22 08:49:33.677267875 +0000 UTC m=+0.047253983 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:49:33 np0005531888 podman[251621]: 2025-11-22 08:49:33.688229045 +0000 UTC m=+0.060328805 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:49:34 np0005531888 nova_compute[186788]: 2025-11-22 08:49:34.410 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:49:34.411 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:49:34 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:49:34.413 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:49:34 np0005531888 nova_compute[186788]: 2025-11-22 08:49:34.616 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:34 np0005531888 nova_compute[186788]: 2025-11-22 08:49:34.651 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:34 np0005531888 nova_compute[186788]: 2025-11-22 08:49:34.742 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:49:34 np0005531888 nova_compute[186788]: 2025-11-22 08:49:34.766 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:49:34 np0005531888 nova_compute[186788]: 2025-11-22 08:49:34.767 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:49:36.416 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:49:36 np0005531888 nova_compute[186788]: 2025-11-22 08:49:36.483 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:49:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:49:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:49:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:49:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:49:38 np0005531888 podman[251666]: 2025-11-22 08:49:38.684354846 +0000 UTC m=+0.052676626 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:49:38 np0005531888 podman[251665]: 2025-11-22 08:49:38.687701799 +0000 UTC m=+0.060525020 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:49:38 np0005531888 podman[251667]: 2025-11-22 08:49:38.721427499 +0000 UTC m=+0.084509480 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:49:39 np0005531888 nova_compute[186788]: 2025-11-22 08:49:39.617 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:39 np0005531888 nova_compute[186788]: 2025-11-22 08:49:39.654 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:39 np0005531888 nova_compute[186788]: 2025-11-22 08:49:39.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:40 np0005531888 nova_compute[186788]: 2025-11-22 08:49:40.464 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:44 np0005531888 nova_compute[186788]: 2025-11-22 08:49:44.621 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:44 np0005531888 nova_compute[186788]: 2025-11-22 08:49:44.656 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:44 np0005531888 nova_compute[186788]: 2025-11-22 08:49:44.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:44 np0005531888 nova_compute[186788]: 2025-11-22 08:49:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:44 np0005531888 nova_compute[186788]: 2025-11-22 08:49:44.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:49:45 np0005531888 nova_compute[186788]: 2025-11-22 08:49:45.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:46 np0005531888 nova_compute[186788]: 2025-11-22 08:49:46.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:46 np0005531888 nova_compute[186788]: 2025-11-22 08:49:46.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:49:46 np0005531888 nova_compute[186788]: 2025-11-22 08:49:46.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:49:46 np0005531888 nova_compute[186788]: 2025-11-22 08:49:46.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:49:46 np0005531888 nova_compute[186788]: 2025-11-22 08:49:46.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.042 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.097 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.098 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.165 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.346 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.347 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5520MB free_disk=73.23789596557617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.348 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.348 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.422 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance ee55d0b1-1b51-43ec-9130-fcd07598b09d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.423 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.423 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.462 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.483 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.503 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:49:47 np0005531888 nova_compute[186788]: 2025-11-22 08:49:47.503 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:49:49 np0005531888 nova_compute[186788]: 2025-11-22 08:49:49.623 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:49 np0005531888 nova_compute[186788]: 2025-11-22 08:49:49.658 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:49 np0005531888 nova_compute[186788]: 2025-11-22 08:49:49.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:49 np0005531888 nova_compute[186788]: 2025-11-22 08:49:49.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:49 np0005531888 nova_compute[186788]: 2025-11-22 08:49:49.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:49:53 np0005531888 podman[251732]: 2025-11-22 08:49:53.676405426 +0000 UTC m=+0.044387953 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:49:53 np0005531888 podman[251733]: 2025-11-22 08:49:53.709766807 +0000 UTC m=+0.070296790 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 03:49:54 np0005531888 nova_compute[186788]: 2025-11-22 08:49:54.625 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:54 np0005531888 nova_compute[186788]: 2025-11-22 08:49:54.659 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:58 np0005531888 nova_compute[186788]: 2025-11-22 08:49:58.965 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:58 np0005531888 nova_compute[186788]: 2025-11-22 08:49:58.966 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:49:58 np0005531888 nova_compute[186788]: 2025-11-22 08:49:58.982 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:49:59 np0005531888 nova_compute[186788]: 2025-11-22 08:49:59.628 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:59 np0005531888 nova_compute[186788]: 2025-11-22 08:49:59.661 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:59 np0005531888 nova_compute[186788]: 2025-11-22 08:49:59.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:04 np0005531888 nova_compute[186788]: 2025-11-22 08:50:04.630 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:04 np0005531888 nova_compute[186788]: 2025-11-22 08:50:04.662 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:04 np0005531888 podman[251772]: 2025-11-22 08:50:04.693546186 +0000 UTC m=+0.061489373 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:50:04 np0005531888 podman[251773]: 2025-11-22 08:50:04.69370621 +0000 UTC m=+0.057361422 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:50:09 np0005531888 nova_compute[186788]: 2025-11-22 08:50:09.633 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:09 np0005531888 nova_compute[186788]: 2025-11-22 08:50:09.668 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:09 np0005531888 podman[251815]: 2025-11-22 08:50:09.730477932 +0000 UTC m=+0.083078605 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:50:09 np0005531888 podman[251814]: 2025-11-22 08:50:09.736926791 +0000 UTC m=+0.097804077 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public)
Nov 22 03:50:09 np0005531888 podman[251816]: 2025-11-22 08:50:09.780855511 +0000 UTC m=+0.135525225 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 22 03:50:14 np0005531888 nova_compute[186788]: 2025-11-22 08:50:14.637 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:14 np0005531888 nova_compute[186788]: 2025-11-22 08:50:14.672 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:19 np0005531888 nova_compute[186788]: 2025-11-22 08:50:19.641 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:19 np0005531888 nova_compute[186788]: 2025-11-22 08:50:19.674 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:24 np0005531888 nova_compute[186788]: 2025-11-22 08:50:24.643 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:24 np0005531888 nova_compute[186788]: 2025-11-22 08:50:24.677 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:24 np0005531888 podman[251880]: 2025-11-22 08:50:24.68069717 +0000 UTC m=+0.051668162 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 22 03:50:24 np0005531888 podman[251879]: 2025-11-22 08:50:24.69941132 +0000 UTC m=+0.065258667 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:50:29 np0005531888 nova_compute[186788]: 2025-11-22 08:50:29.645 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:29 np0005531888 nova_compute[186788]: 2025-11-22 08:50:29.678 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:30 np0005531888 nova_compute[186788]: 2025-11-22 08:50:30.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:32 np0005531888 nova_compute[186788]: 2025-11-22 08:50:32.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:32 np0005531888 nova_compute[186788]: 2025-11-22 08:50:32.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:50:32 np0005531888 nova_compute[186788]: 2025-11-22 08:50:32.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:50:33 np0005531888 nova_compute[186788]: 2025-11-22 08:50:33.088 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:50:33 np0005531888 nova_compute[186788]: 2025-11-22 08:50:33.088 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:50:33 np0005531888 nova_compute[186788]: 2025-11-22 08:50:33.088 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:50:33 np0005531888 nova_compute[186788]: 2025-11-22 08:50:33.089 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ee55d0b1-1b51-43ec-9130-fcd07598b09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:33.500 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:33.501 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:50:33 np0005531888 nova_compute[186788]: 2025-11-22 08:50:33.501 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:33.502 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:50:34 np0005531888 nova_compute[186788]: 2025-11-22 08:50:34.461 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:50:34 np0005531888 nova_compute[186788]: 2025-11-22 08:50:34.472 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:50:34 np0005531888 nova_compute[186788]: 2025-11-22 08:50:34.472 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:50:34 np0005531888 nova_compute[186788]: 2025-11-22 08:50:34.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:34 np0005531888 nova_compute[186788]: 2025-11-22 08:50:34.680 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:35 np0005531888 ovn_controller[95067]: 2025-11-22T08:50:35Z|00754|binding|INFO|Releasing lport 2bd744f8-acc7-4ee7-ae3d-7803a979dc4f from this chassis (sb_readonly=0)
Nov 22 03:50:35 np0005531888 nova_compute[186788]: 2025-11-22 08:50:35.477 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:35 np0005531888 podman[251919]: 2025-11-22 08:50:35.677488549 +0000 UTC m=+0.054356838 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 03:50:35 np0005531888 podman[251920]: 2025-11-22 08:50:35.699608133 +0000 UTC m=+0.075229962 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:50:36 np0005531888 nova_compute[186788]: 2025-11-22 08:50:36.467 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.860 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b3', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'hostId': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.861 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:50:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:36.873 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.874 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.875 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2698dbbd-f843-4da2-bcc9-62094aff62de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.862011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '507f6568-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.561517653, 'message_signature': '04438a6878512c3fe3d4752c36d716b194e7bb7fb081e36d34d7558e94711e75'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.862011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '507f78aa-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.561517653, 'message_signature': '8a6255144106ac5abd98543c9bc8b097104680dbd3d90785bba4663268c2976b'}]}, 'timestamp': '2025-11-22 08:50:36.876011', '_unique_id': 'e8093dd4f7ec44f3b7ebcf176dfa7266'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.877 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.884 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ee55d0b1-1b51-43ec-9130-fcd07598b09d / tap771762db-e4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.884 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '810ddfa9-9947-4e6a-ac57-f40d45269f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.879447', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '5080dace-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': 'f27d17122978201a30367a245b7a19607324175529f758d4b832fc84f9fa4b00'}]}, 'timestamp': '2025-11-22 08:50:36.885103', '_unique_id': 'c2789841b3fd4583acec0ce342163723'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.886 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.887 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.887 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>]
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.888 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.incoming.bytes volume: 18742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023df053-26cb-4457-bc63-217f7b106416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18742, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.888276', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '5081693a-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': '2e2dc4e8d245daa7c50ed52c7cad0567aee382a41730ce82be2b5319f0276c76'}]}, 'timestamp': '2025-11-22 08:50:36.888735', '_unique_id': '40253d260b8e49b1a1eaa5e5ddb0644f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.891 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.outgoing.bytes volume: 15440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '702c91a4-86ca-429a-9f0e-dc2066bee223', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15440, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.891195', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '5081db5e-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': '05c3c5f319e9f393579f8a46030e34588553693e22582ffa8f989eb68ada093d'}]}, 'timestamp': '2025-11-22 08:50:36.891620', '_unique_id': 'e737ba2eba6f44a1bf80948d6911a1f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.892 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.924 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.write.requests volume: 373 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.925 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a9acfed-cc99-4fee-b128-df28c298f52f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 373, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.893796', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5086f3f0-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '51ef0a20bf87b0c079ca3dc166028f18d33548684b81563ad1f49f5a729230bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.893796', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5087061a-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '5c609b426cc3ca9215114f6862d3c8b086981a2da659b378ab43086e67f70ae8'}]}, 'timestamp': '2025-11-22 08:50:36.925463', '_unique_id': '8a55bb2ee51640a2a250c63a0e4863c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.928 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.928 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>]
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.928 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.929 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3b677ab-634c-454f-bfe4-a84cf4a39289', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.928911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '50879c4c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.561517653, 'message_signature': 'a30f23bf720c153996c18f8c4fe9da66afa1a5690438d5689372d8af45bde65d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.928911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5087a87c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.561517653, 'message_signature': '7be80ca4057163a74ac3002ca3fc0cb13320311ef8fc21d2c8ce2924225da6f3'}]}, 'timestamp': '2025-11-22 08:50:36.929555', '_unique_id': 'eadc0fd31e6b40579f8e57ccb6885896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.930 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.931 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a748cfb6-54a6-4589-a777-99b45a053acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.931789', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '50880dbc-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': '1750f1c119ae1b5f3504f6c1bac904fb017b88dda9dc4d64591714a12aef4b6e'}]}, 'timestamp': '2025-11-22 08:50:36.932199', '_unique_id': '4ffc121db56c46a19fd055dac91dbe75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.933 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.934 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.read.bytes volume: 31599104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.934 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08de484b-d30f-40dd-a81a-5ca257285dbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31599104, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.934318', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '50886dd4-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '59be4350ec2ad15ddfd4ce382c5b02944e4ee285ee958ba520688e13fd98c081'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.934318', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '50887b80-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': 'f2ab4141998f0ac25f1bc43fc30f0de6d0c85c712feba9f42e49dd3fa3de4a35'}]}, 'timestamp': '2025-11-22 08:50:36.934951', '_unique_id': 'b9cf32ff927146c4824ea92b7c9a628e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.935 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.936 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.read.latency volume: 1445415962 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.938 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.read.latency volume: 79061671 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a7f9792-db31-4e8d-a310-157ce0915c8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1445415962, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.936792', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5088fc7c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': 'c462f6fa84986546b718755cf263ba521bb46e2c424124340de35d0797daffd0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 79061671, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.936792', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '50891b80-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': 'f97de1a3d1a4fc4522ae84979d9a4ca81dc4adf7944c7d2efbe2bf0df74ba07b'}]}, 'timestamp': '2025-11-22 08:50:36.939120', '_unique_id': 'c7c5150a274a44deb25fa197edd3a216'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.940 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.942 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.read.requests volume: 1160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.942 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29735290-86e2-48c0-9d23-d21c8f44489d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1160, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.942075', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '50899d1c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '5c4607d2540d9e4cee7797bd0ae26d971901deb2223ec152a2a2e55c5056cb8f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.942075', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5089aa78-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '7e114b16cdebe3cebd9e46f96021f91c867c7a5db2b4176d33de31a0ccb6a148'}]}, 'timestamp': '2025-11-22 08:50:36.942759', '_unique_id': '0e1366e625514e6db2eb622853641e38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.943 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.961 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/memory.usage volume: 42.2109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d5e4e64-30a8-4a21-bfaa-2d5e444fdb77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.2109375, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'timestamp': '2025-11-22T08:50:36.944400', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '508caad4-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.661192895, 'message_signature': '256a1e5541eca3c4d362366475ddbb19e86274b8602e169023d5ca59a70cb072'}]}, 'timestamp': '2025-11-22 08:50:36.962491', '_unique_id': 'ebb4b6268a934cc48728f56f87f3458a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.963 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.965 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.965 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>]
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.966 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.967 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.outgoing.packets volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '298a399d-0e58-433d-9353-c8e99c4335f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 108, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.967012', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '508d71d0-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': 'f824faafb7a7681dc7b5aa1cc07203b7161a0218d56a88aa475763ba2529f091'}]}, 'timestamp': '2025-11-22 08:50:36.967646', '_unique_id': 'ba1b9e533a8d41c083edf0b33f176e07'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.969 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.970 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f00492bb-08a0-45c5-9004-a8c4c398b500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.970437', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '508defca-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': 'c9f2d32f369c7bfe6bac5b03ec0405e7bffeb90750f77669d92e40d51d6fdb7b'}]}, 'timestamp': '2025-11-22 08:50:36.970716', '_unique_id': 'ae4906c23c16418e9ace94a75be38bbe'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.971 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.972 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff3f43bc-c696-4fc3-b6df-fd74a64425c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.972024', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '508e2e54-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': 'd42bcdde8a875431993eb3af60b61e50fcd1edeeaafea9929eb524f4f254ac8c'}]}, 'timestamp': '2025-11-22 08:50:36.972345', '_unique_id': 'a7696ddf45fc4da9b34788b9d66130be'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.973 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.974 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.incoming.packets volume: 101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f8294d-eabe-4000-8e8b-fa3de8afb334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 101, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.974182', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '508e819c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': '5bb189979512631cdb677f1387f238f6f44f946e6d32e850d341cd0475ee866c'}]}, 'timestamp': '2025-11-22 08:50:36.974451', '_unique_id': 'c9a1f60103944cd8b43720c4956dafc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.write.bytes volume: 73150464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.975 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6104a61-4f33-449a-8b35-6cb861d9258c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73150464, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.975708', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '508ebd2e-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '72bec1cd12e9f354df2d015eb7baecc999e87d42cfd7b1b1ff81454bac7f7f6d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.975708', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '508ec512-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '2bc78900fbba58a0dc617003e305b7052a8fb0ecdb086eb1ec9e92b7167e5b9b'}]}, 'timestamp': '2025-11-22 08:50:36.976146', '_unique_id': '697d766a33c44bb8a69d4e929f6f3e1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.976 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.977 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9111bb31-b969-4d36-a935-31d8f8bb01f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.977698', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '508f0a9a-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': '048eeb0f8da286170bb90e7ad2484944cd27fd53c2a2c07299f155a0b574fd3f'}]}, 'timestamp': '2025-11-22 08:50:36.977986', '_unique_id': '91dbbf445575435f8bf7f0a7585d067c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.978 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.979 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.979 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.979 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5f75a69-ae2d-43df-9354-b99a8a68e715', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.979198', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '508f4708-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.561517653, 'message_signature': '60eea25015c3dd54e0be0cb498babf93f72d6e8ed85492bcdd275876dcbe80d0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 
'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.979198', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '508f52b6-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.561517653, 'message_signature': 'f68700a4b834fa1bd3d83a17d534d8dc837d4e42042b0abecf0cab431759153d'}]}, 'timestamp': '2025-11-22 08:50:36.979798', '_unique_id': '4da0df491c0648119314f0fd6955d67c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.980 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.981 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.981 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/cpu volume: 13480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f39bd33-cb95-4bf9-bff7-95788405e39b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13480000000, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'timestamp': '2025-11-22T08:50:36.981261', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '508f9870-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.661192895, 'message_signature': '7876855ca542bed58b6203bbe3500632fe51d4f7cbb31bee8f00124b99d703d8'}]}, 'timestamp': '2025-11-22 08:50:36.981658', '_unique_id': 'a7ffa4720ba04c48883f457d1a79a936'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd056378d-2136-4cb8-b8e0-ea9321b383ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b3-ee55d0b1-1b51-43ec-9130-fcd07598b09d-tap771762db-e4', 'timestamp': '2025-11-22T08:50:36.983032', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'tap771762db-e4', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:af:4d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap771762db-e4'}, 'message_id': '508fdbdc-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.578978182, 'message_signature': 'f41cf0ffd1eafec3e8aaec14d50e512253b0db4661512668ae79e963a91b024d'}]}, 'timestamp': '2025-11-22 08:50:36.983337', '_unique_id': '4bb9f29298bc4676b5d248f946f8f8c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.983 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.984 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.write.latency volume: 143518522630 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.984 12 DEBUG ceilometer.compute.pollsters [-] ee55d0b1-1b51-43ec-9130-fcd07598b09d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d2f8382-65ff-48b5-a0c2-17dc75d5435d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 143518522630, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-vda', 'timestamp': '2025-11-22T08:50:36.984649', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '509019a8-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': 'fb91dccd985f0349c12f73f809a3c55c41afb40c410ab854d3623dfa1d6a21ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d-sda', 'timestamp': '2025-11-22T08:50:36.984649', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267', 'name': 'instance-000000b3', 'instance_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '50902470-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8177.593315595, 'message_signature': '8de83bf0a7a6a1b5c547c2231a59a5b2f1d9b8d01f440fc497398341f5942bed'}]}, 'timestamp': '2025-11-22 08:50:36.985181', '_unique_id': '067b8b1da2414a4e979faf631d1771c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.985 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.986 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:50:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:50:36.986 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267>]
Nov 22 03:50:39 np0005531888 nova_compute[186788]: 2025-11-22 08:50:39.650 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:39 np0005531888 nova_compute[186788]: 2025-11-22 08:50:39.682 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.607 186792 DEBUG nova.compute.manager [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-changed-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.608 186792 DEBUG nova.compute.manager [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Refreshing instance network info cache due to event network-changed-771762db-e480-4c22-adb2-5add2ca49ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.608 186792 DEBUG oslo_concurrency.lockutils [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.608 186792 DEBUG oslo_concurrency.lockutils [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.608 186792 DEBUG nova.network.neutron [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Refreshing network info cache for port 771762db-e480-4c22-adb2-5add2ca49ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.675 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.675 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.676 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.676 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.676 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.683 186792 INFO nova.compute.manager [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Terminating instance#033[00m
Nov 22 03:50:40 np0005531888 podman[251960]: 2025-11-22 08:50:40.684954369 +0000 UTC m=+0.058266164 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:50:40 np0005531888 podman[251959]: 2025-11-22 08:50:40.688065216 +0000 UTC m=+0.063728679 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.688 186792 DEBUG nova.compute.manager [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:50:40 np0005531888 kernel: tap771762db-e4 (unregistering): left promiscuous mode
Nov 22 03:50:40 np0005531888 NetworkManager[55166]: <info>  [1763801440.7162] device (tap771762db-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:50:40 np0005531888 podman[251961]: 2025-11-22 08:50:40.722653337 +0000 UTC m=+0.093827519 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:50:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:50:40Z|00755|binding|INFO|Releasing lport 771762db-e480-4c22-adb2-5add2ca49ca1 from this chassis (sb_readonly=0)
Nov 22 03:50:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:50:40Z|00756|binding|INFO|Setting lport 771762db-e480-4c22-adb2-5add2ca49ca1 down in Southbound
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.725 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 ovn_controller[95067]: 2025-11-22T08:50:40Z|00757|binding|INFO|Removing iface tap771762db-e4 ovn-installed in OVS
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:40.734 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:af:4d 10.100.0.3'], port_security=['fa:16:3e:83:af:4d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ee55d0b1-1b51-43ec-9130-fcd07598b09d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '4', 'neutron:security_group_ids': '715876a5-b868-42b2-a805-7eead09bd16c f416bbe7-d5d7-442c-98e9-b8655a8e5fb3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=37c2f4c5-2df7-42e5-bbdc-62b417459328, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=771762db-e480-4c22-adb2-5add2ca49ca1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:50:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:40.736 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 771762db-e480-4c22-adb2-5add2ca49ca1 in datapath ce8ebe40-99cf-4666-80d8-abaaccce68fa unbound from our chassis#033[00m
Nov 22 03:50:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:40.739 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce8ebe40-99cf-4666-80d8-abaaccce68fa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:50:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:40.740 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[7717ac6d-e5d7-4e82-ac84-a5b41b58cec9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:40 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:40.741 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa namespace which is not needed anymore#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.743 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Nov 22 03:50:40 np0005531888 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b3.scope: Consumed 18.268s CPU time.
Nov 22 03:50:40 np0005531888 systemd-machined[153106]: Machine qemu-86-instance-000000b3 terminated.
Nov 22 03:50:40 np0005531888 kernel: tap771762db-e4: entered promiscuous mode
Nov 22 03:50:40 np0005531888 kernel: tap771762db-e4 (unregistering): left promiscuous mode
Nov 22 03:50:40 np0005531888 NetworkManager[55166]: <info>  [1763801440.9111] manager: (tap771762db-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/358)
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.916 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.942 186792 INFO nova.virt.libvirt.driver [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Instance destroyed successfully.#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.943 186792 DEBUG nova.objects.instance [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid ee55d0b1-1b51-43ec-9130-fcd07598b09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.954 186792 DEBUG nova.virt.libvirt.vif [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-362933267',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=179,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHijZSELA8UT9gszsLo3RsTctjovO/NtSWYTVXotL5SJYEd8P/VZKPX7fPbirjPiSQmt0NO+JaNFVLlHWa5XGAcq5av/AMCam8IbhgKAnkOspqonr6bwIW7QLasEA2sUlw==',key_name='tempest-TestSecurityGroupsBasicOps-1510272773',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:48:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-ek9chz33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:48:56Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=ee55d0b1-1b51-43ec-9130-fcd07598b09d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.954 186792 DEBUG nova.network.os_vif_util [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.955 186792 DEBUG nova.network.os_vif_util [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.955 186792 DEBUG os_vif [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.957 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.957 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap771762db-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.960 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.963 186792 INFO os_vif [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:af:4d,bridge_name='br-int',has_traffic_filtering=True,id=771762db-e480-4c22-adb2-5add2ca49ca1,network=Network(ce8ebe40-99cf-4666-80d8-abaaccce68fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap771762db-e4')#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.964 186792 INFO nova.virt.libvirt.driver [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Deleting instance files /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d_del#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.965 186792 INFO nova.virt.libvirt.driver [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Deletion of /var/lib/nova/instances/ee55d0b1-1b51-43ec-9130-fcd07598b09d_del complete#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.986 186792 DEBUG nova.compute.manager [req-14c22f9b-0d6c-4864-8070-29b518550d8a req-76182779-e4bf-4232-88ea-f45be8cd5895 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-vif-unplugged-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.986 186792 DEBUG oslo_concurrency.lockutils [req-14c22f9b-0d6c-4864-8070-29b518550d8a req-76182779-e4bf-4232-88ea-f45be8cd5895 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.986 186792 DEBUG oslo_concurrency.lockutils [req-14c22f9b-0d6c-4864-8070-29b518550d8a req-76182779-e4bf-4232-88ea-f45be8cd5895 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.986 186792 DEBUG oslo_concurrency.lockutils [req-14c22f9b-0d6c-4864-8070-29b518550d8a req-76182779-e4bf-4232-88ea-f45be8cd5895 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.986 186792 DEBUG nova.compute.manager [req-14c22f9b-0d6c-4864-8070-29b518550d8a req-76182779-e4bf-4232-88ea-f45be8cd5895 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] No waiting events found dispatching network-vif-unplugged-771762db-e480-4c22-adb2-5add2ca49ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:50:40 np0005531888 nova_compute[186788]: 2025-11-22 08:50:40.987 186792 DEBUG nova.compute.manager [req-14c22f9b-0d6c-4864-8070-29b518550d8a req-76182779-e4bf-4232-88ea-f45be8cd5895 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-vif-unplugged-771762db-e480-4c22-adb2-5add2ca49ca1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:50:41 np0005531888 nova_compute[186788]: 2025-11-22 08:50:41.045 186792 INFO nova.compute.manager [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:50:41 np0005531888 nova_compute[186788]: 2025-11-22 08:50:41.046 186792 DEBUG oslo.service.loopingcall [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:50:41 np0005531888 nova_compute[186788]: 2025-11-22 08:50:41.046 186792 DEBUG nova.compute.manager [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:50:41 np0005531888 nova_compute[186788]: 2025-11-22 08:50:41.046 186792 DEBUG nova.network.neutron [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:50:41 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [NOTICE]   (251436) : haproxy version is 2.8.14-c23fe91
Nov 22 03:50:41 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [NOTICE]   (251436) : path to executable is /usr/sbin/haproxy
Nov 22 03:50:41 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [WARNING]  (251436) : Exiting Master process...
Nov 22 03:50:41 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [WARNING]  (251436) : Exiting Master process...
Nov 22 03:50:41 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [ALERT]    (251436) : Current worker (251438) exited with code 143 (Terminated)
Nov 22 03:50:41 np0005531888 neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa[251432]: [WARNING]  (251436) : All workers exited. Exiting... (0)
Nov 22 03:50:41 np0005531888 systemd[1]: libpod-6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474.scope: Deactivated successfully.
Nov 22 03:50:41 np0005531888 podman[252050]: 2025-11-22 08:50:41.145862078 +0000 UTC m=+0.327928638 container died 6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:50:41 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474-userdata-shm.mount: Deactivated successfully.
Nov 22 03:50:41 np0005531888 systemd[1]: var-lib-containers-storage-overlay-6dc3e43da3460363e8ef7c18befb52b70c8392fef6afeb1a55809b96ff21b036-merged.mount: Deactivated successfully.
Nov 22 03:50:41 np0005531888 podman[252050]: 2025-11-22 08:50:41.934982432 +0000 UTC m=+1.117048992 container cleanup 6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:50:41 np0005531888 systemd[1]: libpod-conmon-6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474.scope: Deactivated successfully.
Nov 22 03:50:41 np0005531888 nova_compute[186788]: 2025-11-22 08:50:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:42 np0005531888 podman[252097]: 2025-11-22 08:50:42.302770351 +0000 UTC m=+0.346839575 container remove 6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.310 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ff30bb34-5961-40fb-9913-8c7356d07684]: (4, ('Sat Nov 22 08:50:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa (6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474)\n6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474\nSat Nov 22 08:50:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa (6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474)\n6573a5cc5b8163e6966e562048f61e783503064883112af71e456375637e0474\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.312 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6076fe-6950-4809-8a1d-827a69486f19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.312 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce8ebe40-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:50:42 np0005531888 nova_compute[186788]: 2025-11-22 08:50:42.314 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:42 np0005531888 kernel: tapce8ebe40-90: left promiscuous mode
Nov 22 03:50:42 np0005531888 nova_compute[186788]: 2025-11-22 08:50:42.325 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:42 np0005531888 nova_compute[186788]: 2025-11-22 08:50:42.326 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.328 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[206007be-37a7-4157-803a-ff05784143b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.347 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a32ec464-6685-41f8-a7dd-6f19f271ca7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.348 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3eed1204-d7dd-4fdd-bc83-a6ea8c7eed2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.367 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[92e04c52-7fa9-40fb-a185-5a3e5157d187]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 807649, 'reachable_time': 28413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252114, 'error': None, 'target': 'ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.369 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce8ebe40-99cf-4666-80d8-abaaccce68fa deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:50:42 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:50:42.370 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4990da-af95-4e19-8cb2-4986970d0627]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:50:42 np0005531888 systemd[1]: run-netns-ovnmeta\x2dce8ebe40\x2d99cf\x2d4666\x2d80d8\x2dabaaccce68fa.mount: Deactivated successfully.
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.471 186792 DEBUG nova.network.neutron [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updated VIF entry in instance network info cache for port 771762db-e480-4c22-adb2-5add2ca49ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.472 186792 DEBUG nova.network.neutron [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [{"id": "771762db-e480-4c22-adb2-5add2ca49ca1", "address": "fa:16:3e:83:af:4d", "network": {"id": "ce8ebe40-99cf-4666-80d8-abaaccce68fa", "bridge": "br-int", "label": "tempest-network-smoke--958489049", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap771762db-e4", "ovs_interfaceid": "771762db-e480-4c22-adb2-5add2ca49ca1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.505 186792 DEBUG oslo_concurrency.lockutils [req-b01a2924-8000-4102-8042-a791a59887b1 req-0cf3c438-924c-47fe-8155-c4d34b358aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ee55d0b1-1b51-43ec-9130-fcd07598b09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.544 186792 DEBUG nova.compute.manager [req-ea5ebb90-8ea9-4336-a844-2eae7151bc44 req-4fce3bf8-0082-4ab1-9dd0-edeb70c65a66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.544 186792 DEBUG oslo_concurrency.lockutils [req-ea5ebb90-8ea9-4336-a844-2eae7151bc44 req-4fce3bf8-0082-4ab1-9dd0-edeb70c65a66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.546 186792 DEBUG oslo_concurrency.lockutils [req-ea5ebb90-8ea9-4336-a844-2eae7151bc44 req-4fce3bf8-0082-4ab1-9dd0-edeb70c65a66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.546 186792 DEBUG oslo_concurrency.lockutils [req-ea5ebb90-8ea9-4336-a844-2eae7151bc44 req-4fce3bf8-0082-4ab1-9dd0-edeb70c65a66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.546 186792 DEBUG nova.compute.manager [req-ea5ebb90-8ea9-4336-a844-2eae7151bc44 req-4fce3bf8-0082-4ab1-9dd0-edeb70c65a66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] No waiting events found dispatching network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.546 186792 WARNING nova.compute.manager [req-ea5ebb90-8ea9-4336-a844-2eae7151bc44 req-4fce3bf8-0082-4ab1-9dd0-edeb70c65a66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received unexpected event network-vif-plugged-771762db-e480-4c22-adb2-5add2ca49ca1 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.662 186792 DEBUG nova.network.neutron [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.690 186792 INFO nova.compute.manager [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Took 2.64 seconds to deallocate network for instance.#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.752 186792 DEBUG nova.compute.manager [req-6c9bdaf4-cfd1-4fc8-b45f-2761f2774fc7 req-fae1a780-55c7-4a78-a927-ba90eaeea648 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Received event network-vif-deleted-771762db-e480-4c22-adb2-5add2ca49ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.755 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.755 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.784 186792 DEBUG nova.scheduler.client.report [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.799 186792 DEBUG nova.scheduler.client.report [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.800 186792 DEBUG nova.compute.provider_tree [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.813 186792 DEBUG nova.scheduler.client.report [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.836 186792 DEBUG nova.scheduler.client.report [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.875 186792 DEBUG nova.compute.provider_tree [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.891 186792 DEBUG nova.scheduler.client.report [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.910 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.933 186792 INFO nova.scheduler.client.report [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance ee55d0b1-1b51-43ec-9130-fcd07598b09d#033[00m
Nov 22 03:50:43 np0005531888 nova_compute[186788]: 2025-11-22 08:50:43.992 186792 DEBUG oslo_concurrency.lockutils [None req-20926c5c-61f0-48c0-adda-e2de5ab47b97 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "ee55d0b1-1b51-43ec-9130-fcd07598b09d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:44 np0005531888 nova_compute[186788]: 2025-11-22 08:50:44.683 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:45 np0005531888 nova_compute[186788]: 2025-11-22 08:50:45.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:45 np0005531888 nova_compute[186788]: 2025-11-22 08:50:45.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:45 np0005531888 nova_compute[186788]: 2025-11-22 08:50:45.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:50:45 np0005531888 nova_compute[186788]: 2025-11-22 08:50:45.960 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:46 np0005531888 nova_compute[186788]: 2025-11-22 08:50:46.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:48 np0005531888 nova_compute[186788]: 2025-11-22 08:50:48.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:48 np0005531888 nova_compute[186788]: 2025-11-22 08:50:48.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:48 np0005531888 nova_compute[186788]: 2025-11-22 08:50:48.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:48 np0005531888 nova_compute[186788]: 2025-11-22 08:50:48.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:48 np0005531888 nova_compute[186788]: 2025-11-22 08:50:48.983 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.140 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.141 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.25897598266602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.141 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.141 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.260 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.260 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.299 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.314 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.333 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.333 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:49 np0005531888 nova_compute[186788]: 2025-11-22 08:50:49.684 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:50 np0005531888 nova_compute[186788]: 2025-11-22 08:50:50.334 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:50 np0005531888 nova_compute[186788]: 2025-11-22 08:50:50.850 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:50 np0005531888 nova_compute[186788]: 2025-11-22 08:50:50.918 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:50 np0005531888 nova_compute[186788]: 2025-11-22 08:50:50.962 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:54 np0005531888 nova_compute[186788]: 2025-11-22 08:50:54.686 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:55 np0005531888 podman[252117]: 2025-11-22 08:50:55.674879174 +0000 UTC m=+0.049183181 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:50:55 np0005531888 podman[252118]: 2025-11-22 08:50:55.703522409 +0000 UTC m=+0.077310003 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:50:55 np0005531888 nova_compute[186788]: 2025-11-22 08:50:55.942 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801440.9407685, ee55d0b1-1b51-43ec-9130-fcd07598b09d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:50:55 np0005531888 nova_compute[186788]: 2025-11-22 08:50:55.943 186792 INFO nova.compute.manager [-] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:50:55 np0005531888 nova_compute[186788]: 2025-11-22 08:50:55.964 186792 DEBUG nova.compute.manager [None req-031c2755-b901-4e40-aabf-b3b1a2fcbb55 - - - - - -] [instance: ee55d0b1-1b51-43ec-9130-fcd07598b09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:50:55 np0005531888 nova_compute[186788]: 2025-11-22 08:50:55.964 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:59 np0005531888 nova_compute[186788]: 2025-11-22 08:50:59.689 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:00 np0005531888 nova_compute[186788]: 2025-11-22 08:51:00.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:04 np0005531888 nova_compute[186788]: 2025-11-22 08:51:04.691 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:05 np0005531888 nova_compute[186788]: 2025-11-22 08:51:05.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:06 np0005531888 podman[252161]: 2025-11-22 08:51:06.674240105 +0000 UTC m=+0.048795112 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:51:06 np0005531888 podman[252160]: 2025-11-22 08:51:06.701132446 +0000 UTC m=+0.078166044 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:51:09 np0005531888 nova_compute[186788]: 2025-11-22 08:51:09.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:10 np0005531888 nova_compute[186788]: 2025-11-22 08:51:10.971 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:11 np0005531888 podman[252205]: 2025-11-22 08:51:11.691670531 +0000 UTC m=+0.060965241 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:51:11 np0005531888 podman[252204]: 2025-11-22 08:51:11.691638399 +0000 UTC m=+0.065384349 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc.)
Nov 22 03:51:11 np0005531888 podman[252206]: 2025-11-22 08:51:11.736416591 +0000 UTC m=+0.103050836 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:51:12 np0005531888 nova_compute[186788]: 2025-11-22 08:51:12.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:14 np0005531888 nova_compute[186788]: 2025-11-22 08:51:14.695 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:15 np0005531888 nova_compute[186788]: 2025-11-22 08:51:15.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:19 np0005531888 nova_compute[186788]: 2025-11-22 08:51:19.696 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:20 np0005531888 nova_compute[186788]: 2025-11-22 08:51:20.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:24 np0005531888 nova_compute[186788]: 2025-11-22 08:51:24.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:25 np0005531888 nova_compute[186788]: 2025-11-22 08:51:25.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:26 np0005531888 podman[252269]: 2025-11-22 08:51:26.674270045 +0000 UTC m=+0.050568975 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:51:26 np0005531888 podman[252268]: 2025-11-22 08:51:26.697373744 +0000 UTC m=+0.076414602 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:51:29 np0005531888 nova_compute[186788]: 2025-11-22 08:51:29.700 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:30 np0005531888 nova_compute[186788]: 2025-11-22 08:51:30.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:32 np0005531888 nova_compute[186788]: 2025-11-22 08:51:32.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:33 np0005531888 nova_compute[186788]: 2025-11-22 08:51:33.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:33 np0005531888 nova_compute[186788]: 2025-11-22 08:51:33.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:51:33 np0005531888 nova_compute[186788]: 2025-11-22 08:51:33.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:51:33 np0005531888 nova_compute[186788]: 2025-11-22 08:51:33.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:51:34 np0005531888 nova_compute[186788]: 2025-11-22 08:51:34.703 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:35 np0005531888 ovn_controller[95067]: 2025-11-22T08:51:35Z|00758|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 22 03:51:35 np0005531888 nova_compute[186788]: 2025-11-22 08:51:35.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:35 np0005531888 nova_compute[186788]: 2025-11-22 08:51:35.986 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:36.874 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:37 np0005531888 podman[252310]: 2025-11-22 08:51:37.669032734 +0000 UTC m=+0.042681692 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:51:37 np0005531888 podman[252309]: 2025-11-22 08:51:37.677274787 +0000 UTC m=+0.054818890 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:51:39 np0005531888 nova_compute[186788]: 2025-11-22 08:51:39.705 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:40 np0005531888 nova_compute[186788]: 2025-11-22 08:51:40.988 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:41 np0005531888 nova_compute[186788]: 2025-11-22 08:51:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:42 np0005531888 podman[252350]: 2025-11-22 08:51:42.69545437 +0000 UTC m=+0.065319558 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Nov 22 03:51:42 np0005531888 podman[252351]: 2025-11-22 08:51:42.702279298 +0000 UTC m=+0.063255807 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:51:42 np0005531888 podman[252357]: 2025-11-22 08:51:42.730369189 +0000 UTC m=+0.087264578 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:51:44 np0005531888 nova_compute[186788]: 2025-11-22 08:51:44.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:45 np0005531888 nova_compute[186788]: 2025-11-22 08:51:45.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:46 np0005531888 nova_compute[186788]: 2025-11-22 08:51:46.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:46 np0005531888 nova_compute[186788]: 2025-11-22 08:51:46.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:51:47 np0005531888 nova_compute[186788]: 2025-11-22 08:51:47.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:48 np0005531888 nova_compute[186788]: 2025-11-22 08:51:48.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:49 np0005531888 nova_compute[186788]: 2025-11-22 08:51:49.709 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:49 np0005531888 nova_compute[186788]: 2025-11-22 08:51:49.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:50 np0005531888 nova_compute[186788]: 2025-11-22 08:51:50.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:50 np0005531888 nova_compute[186788]: 2025-11-22 08:51:50.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:50 np0005531888 nova_compute[186788]: 2025-11-22 08:51:50.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:50 np0005531888 nova_compute[186788]: 2025-11-22 08:51:50.974 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:50 np0005531888 nova_compute[186788]: 2025-11-22 08:51:50.974 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:51:50 np0005531888 nova_compute[186788]: 2025-11-22 08:51:50.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.154 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.156 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5727MB free_disk=73.25895690917969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.156 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.157 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.221 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.222 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.292 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.308 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.310 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:51:51 np0005531888 nova_compute[186788]: 2025-11-22 08:51:51.310 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.358 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.358 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.376 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.523 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.524 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.530 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.531 186792 INFO nova.compute.claims [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.635 186792 DEBUG nova.compute.provider_tree [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.647 186792 DEBUG nova.scheduler.client.report [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.665 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.666 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.713 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.714 186792 DEBUG nova.network.neutron [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.728 186792 INFO nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.739 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.821 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.823 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.823 186792 INFO nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Creating image(s)#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.824 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.824 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.824 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
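The three lockutils lines above show the acquire/run/release pattern `oslo_concurrency` wraps around the `disk.info` write. A minimal stdlib-only sketch of an equivalent interprocess file lock (fcntl-based, Linux-only; the `interprocess_lock` helper is illustrative, not Nova's actual API):

```python
import fcntl
import os
from contextlib import contextmanager

@contextmanager
def interprocess_lock(path):
    """Serialize access across processes by flocking a lock file,
    roughly what oslo_concurrency.lockutils does for external locks."""
    fd = os.open(path, os.O_CREAT | os.O_RDWR, 0o644)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)   # "Acquiring lock" (blocks until held)
        yield                            # critical section, e.g. write disk.info
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)   # Lock "released"
        os.close(fd)

# usage: guard writes to a shared disk.info-style file
with interprocess_lock("/tmp/disk.info.lock"):
    pass  # write_to_disk_info_file() would run here
```

The log's `waited 0.000s` / `held 0.001s` figures are what lockutils reports for the blocking acquire and the critical section, respectively.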
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.835 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.893 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.894 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.895 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.912 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.970 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:51:53 np0005531888 nova_compute[186788]: 2025-11-22 08:51:53.971 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.159 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk 1073741824" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
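The `qemu-img create` invocation above builds a copy-on-write qcow2 overlay whose backing file is the cached raw base image, so the instance disk starts nearly empty and only diverging blocks are stored. A sketch that assembles the same command line (the `build_overlay_cmd` helper is illustrative; the command is only constructed here, not executed):

```python
def build_overlay_cmd(base, overlay, size_bytes):
    """Build the qemu-img command Nova runs to create a qcow2
    overlay on top of a raw base image (copy-on-write)."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        overlay, str(size_bytes),
    ]

cmd = build_overlay_cmd(
    "/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726",
    "/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk",
    1073741824,  # 1 GiB, matching the flavor's root_gb=1
)
```

Note the surrounding `qemu-img info` calls are run under `oslo_concurrency.prlimit` (address-space and CPU caps) as a guard against malformed images.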
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.162 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.163 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.237 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.239 186792 DEBUG nova.virt.disk.api [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.239 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.313 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.315 186792 DEBUG nova.virt.disk.api [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
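The "Cannot resize image ... to a smaller size" message reflects the guard in `nova.virt.disk.api`: a virtual disk may grow to the flavor size but never shrink, and a requested size equal to the current one (as here, 1073741824 bytes both ways) is also skipped. The gist, as a hedged sketch of that check:

```python
def can_resize_image(virtual_size, requested_size):
    """Growing is allowed; shrinking (or a no-op equal size) is
    skipped, since shrinking would corrupt the guest filesystem."""
    return requested_size > virtual_size

# The overlay was just created at 1 GiB and the flavor asks for 1 GiB,
# so the resize is skipped, matching the log line above.
skip = not can_resize_image(1073741824, 1073741824)
```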
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.316 186792 DEBUG nova.objects.instance [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.334 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.335 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Ensure instance console log exists: /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.335 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.336 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.336 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.561 186792 DEBUG nova.policy [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
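The policy failure above is expected for a non-admin request: `network:attach_external_network` defaults to admin-only, and these credentials carry only the `reader` and `member` roles (`is_admin: False`). A toy role-based check illustrating the outcome; this is a deliberate simplification, not oslo.policy's actual rule engine:

```python
# Hypothetical simplification: real Nova evaluates oslo.policy rule strings.
POLICY_RULES = {"network:attach_external_network": {"admin"}}

def authorize(action, credentials):
    """Allow the action if any of the caller's roles satisfies the rule."""
    required = POLICY_RULES.get(action)
    if not required:
        return True  # no rule registered: permit by default in this toy model
    return bool(required & set(credentials["roles"]))

creds = {"roles": ["reader", "member"], "is_admin": False}
allowed = authorize("network:attach_external_network", creds)
```

The DEBUG-level "failed" here is informational: Nova probes the policy to decide whether external networks are attachable, and boot proceeds normally without that privilege.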
Nov 22 03:51:54 np0005531888 nova_compute[186788]: 2025-11-22 08:51:54.714 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:55 np0005531888 nova_compute[186788]: 2025-11-22 08:51:55.973 186792 DEBUG nova.network.neutron [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Successfully created port: 51161e59-b6f9-460b-bd0f-76262417cbcc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:51:55 np0005531888 nova_compute[186788]: 2025-11-22 08:51:55.998 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.686 186792 DEBUG nova.network.neutron [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Successfully updated port: 51161e59-b6f9-460b-bd0f-76262417cbcc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.697 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-17df6007-a93e-4318-a7a4-c3bc20dfd8f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.698 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-17df6007-a93e-4318-a7a4-c3bc20dfd8f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.698 186792 DEBUG nova.network.neutron [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.770 186792 DEBUG nova.compute.manager [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-changed-51161e59-b6f9-460b-bd0f-76262417cbcc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.771 186792 DEBUG nova.compute.manager [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Refreshing instance network info cache due to event network-changed-51161e59-b6f9-460b-bd0f-76262417cbcc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:51:56 np0005531888 nova_compute[186788]: 2025-11-22 08:51:56.771 186792 DEBUG oslo_concurrency.lockutils [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-17df6007-a93e-4318-a7a4-c3bc20dfd8f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:51:57 np0005531888 nova_compute[186788]: 2025-11-22 08:51:57.490 186792 DEBUG nova.network.neutron [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:51:57 np0005531888 podman[252431]: 2025-11-22 08:51:57.677868119 +0000 UTC m=+0.052265797 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:51:57 np0005531888 podman[252432]: 2025-11-22 08:51:57.707713904 +0000 UTC m=+0.081741283 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.758 186792 DEBUG nova.network.neutron [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Updating instance_info_cache with network_info: [{"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.776 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-17df6007-a93e-4318-a7a4-c3bc20dfd8f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.776 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Instance network_info: |[{"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.777 186792 DEBUG oslo_concurrency.lockutils [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-17df6007-a93e-4318-a7a4-c3bc20dfd8f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.777 186792 DEBUG nova.network.neutron [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Refreshing network info cache for port 51161e59-b6f9-460b-bd0f-76262417cbcc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.780 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Start _get_guest_xml network_info=[{"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
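The network_info blob logged at `_get_guest_xml` is plain JSON, which makes it convenient to pull out the bound port details when debugging. A small sketch extracting the fixed IP and OVS tap device from a trimmed copy of that payload:

```python
import json

# Trimmed copy of the network_info JSON logged above.
network_info = json.loads("""[{
  "id": "51161e59-b6f9-460b-bd0f-76262417cbcc",
  "address": "fa:16:3e:a3:5a:ba",
  "devname": "tap51161e59-b6",
  "network": {"bridge": "br-int",
    "subnets": [{"cidr": "10.100.0.0/28",
      "ips": [{"address": "10.100.0.12", "type": "fixed"}]}]}
}]""")

vif = network_info[0]
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
# vif["devname"] is the tap device that OVN plugs into br-int.
```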
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.784 186792 WARNING nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.792 186792 DEBUG nova.virt.libvirt.host [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.792 186792 DEBUG nova.virt.libvirt.host [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.796 186792 DEBUG nova.virt.libvirt.host [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.797 186792 DEBUG nova.virt.libvirt.host [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.798 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.799 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.799 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.799 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.800 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.800 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.800 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.800 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.800 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.801 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.801 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.801 186792 DEBUG nova.virt.hardware [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.804 186792 DEBUG nova.virt.libvirt.vif [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:51:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=182,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiBPbwkfvp292DdxiAKQErFTXea99iSJ0iNktwUcZngbhgOVkJ8LcCoiBgeLTItTjMH75el9p0D+2vq3rbfgroqRlNCO9aORnrX2+bekE/q3IKlWTvN5P4p1NDQcXdubA==',key_name='tempest-TestSecurityGroupsBasicOps-1102160890',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-jz68ypq7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:51:53Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=17df6007-a93e-4318-a7a4-c3bc20dfd8f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.805 186792 DEBUG nova.network.os_vif_util [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.805 186792 DEBUG nova.network.os_vif_util [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.806 186792 DEBUG nova.objects.instance [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.818 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <uuid>17df6007-a93e-4318-a7a4-c3bc20dfd8f4</uuid>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <name>instance-000000b6</name>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164</nova:name>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:51:58</nova:creationTime>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        <nova:port uuid="51161e59-b6f9-460b-bd0f-76262417cbcc">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <entry name="serial">17df6007-a93e-4318-a7a4-c3bc20dfd8f4</entry>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <entry name="uuid">17df6007-a93e-4318-a7a4-c3bc20dfd8f4</entry>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.config"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:a3:5a:ba"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <target dev="tap51161e59-b6"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/console.log" append="off"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:51:58 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:51:58 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:51:58 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:51:58 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.820 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Preparing to wait for external event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.820 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.820 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.820 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.821 186792 DEBUG nova.virt.libvirt.vif [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:51:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=182,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiBPbwkfvp292DdxiAKQErFTXea99iSJ0iNktwUcZngbhgOVkJ8LcCoiBgeLTItTjMH75el9p0D+2vq3rbfgroqRlNCO9aORnrX2+bekE/q3IKlWTvN5P4p1NDQcXdubA==',key_name='tempest-TestSecurityGroupsBasicOps-1102160890',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-jz68ypq7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:51:53Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=17df6007-a93e-4318-a7a4-c3bc20dfd8f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.821 186792 DEBUG nova.network.os_vif_util [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.822 186792 DEBUG nova.network.os_vif_util [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.822 186792 DEBUG os_vif [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.822 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.823 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.823 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.826 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.826 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51161e59-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.827 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap51161e59-b6, col_values=(('external_ids', {'iface-id': '51161e59-b6f9-460b-bd0f-76262417cbcc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:5a:ba', 'vm-uuid': '17df6007-a93e-4318-a7a4-c3bc20dfd8f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.828 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:58 np0005531888 NetworkManager[55166]: <info>  [1763801518.8293] manager: (tap51161e59-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.831 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.835 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:58 np0005531888 nova_compute[186788]: 2025-11-22 08:51:58.835 186792 INFO os_vif [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6')#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.071 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.072 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.072 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:a3:5a:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.073 186792 INFO nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Using config drive#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.672 186792 INFO nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Creating config drive at /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.config#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.679 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpea98e1yw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.808 186792 DEBUG oslo_concurrency.processutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpea98e1yw" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:51:59 np0005531888 kernel: tap51161e59-b6: entered promiscuous mode
Nov 22 03:51:59 np0005531888 NetworkManager[55166]: <info>  [1763801519.8579] manager: (tap51161e59-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/360)
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.858 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:51:59Z|00759|binding|INFO|Claiming lport 51161e59-b6f9-460b-bd0f-76262417cbcc for this chassis.
Nov 22 03:51:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:51:59Z|00760|binding|INFO|51161e59-b6f9-460b-bd0f-76262417cbcc: Claiming fa:16:3e:a3:5a:ba 10.100.0.12
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.862 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.869 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 NetworkManager[55166]: <info>  [1763801519.8697] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Nov 22 03:51:59 np0005531888 NetworkManager[55166]: <info>  [1763801519.8703] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Nov 22 03:51:59 np0005531888 systemd-udevd[252492]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:51:59 np0005531888 systemd-machined[153106]: New machine qemu-87-instance-000000b6.
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.889 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:5a:ba 10.100.0.12'], port_security=['fa:16:3e:a3:5a:ba 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '17df6007-a93e-4318-a7a4-c3bc20dfd8f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e6b1900-aaed-4594-90b0-32a5351bb717', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b91ecf2-a2ff-45e5-963e-c9d8b70b8af0, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=51161e59-b6f9-460b-bd0f-76262417cbcc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.891 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 51161e59-b6f9-460b-bd0f-76262417cbcc in datapath 6462ae38-eefd-46f7-8dfd-98d64cb746b6 bound to our chassis#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.892 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6462ae38-eefd-46f7-8dfd-98d64cb746b6#033[00m
Nov 22 03:51:59 np0005531888 NetworkManager[55166]: <info>  [1763801519.9002] device (tap51161e59-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:51:59 np0005531888 NetworkManager[55166]: <info>  [1763801519.9012] device (tap51161e59-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.904 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a50173a5-a755-444c-b0d9-8bc745b2b2b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.905 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6462ae38-e1 in ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.907 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6462ae38-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.907 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fe66874d-44b9-411d-8585-113de4fbc79d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.908 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e6ef788d-11ab-4913-b585-f9d96609eeec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:51:59 np0005531888 systemd[1]: Started Virtual Machine qemu-87-instance-000000b6.
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.920 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[128a0aaa-8608-4ad8-8170-14619175899b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.937 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.946 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e86f02e5-4da8-46e2-ace8-b2187e928fc1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:51:59Z|00761|binding|INFO|Setting lport 51161e59-b6f9-460b-bd0f-76262417cbcc ovn-installed in OVS
Nov 22 03:51:59 np0005531888 ovn_controller[95067]: 2025-11-22T08:51:59Z|00762|binding|INFO|Setting lport 51161e59-b6f9-460b-bd0f-76262417cbcc up in Southbound
Nov 22 03:51:59 np0005531888 nova_compute[186788]: 2025-11-22 08:51:59.958 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.973 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[8ccbea2c-50d9-4564-9600-34879f93cff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:51:59 np0005531888 NetworkManager[55166]: <info>  [1763801519.9803] manager: (tap6462ae38-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/363)
Nov 22 03:51:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:51:59.979 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[fd650bbf-c4e4-40e9-b6c6-f0a077a684fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.006 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ed432c-b131-41e8-8a2c-73dfa999129d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.009 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[f52ec9a2-8020-44c2-93f1-f058b1949d2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 NetworkManager[55166]: <info>  [1763801520.0286] device (tap6462ae38-e0): carrier: link connected
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.033 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e5220bfc-19d4-4101-9255-d5fcc5e5c5c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.047 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[bc97febf-6aaa-4ac9-bba8-1e9a4387aed7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6462ae38-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:92:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 826067, 'reachable_time': 38418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252526, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.061 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[23dd7507-6a84-44ba-be7e-9f78783224b4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:927b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 826067, 'tstamp': 826067}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252527, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.074 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b3538d-d615-4fef-80fa-b277e9971d4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6462ae38-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:92:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 826067, 'reachable_time': 38418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252528, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.099 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a3350cc5-d687-4c52-806b-1077d73e802c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.152 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e741e6f1-1df1-448d-867e-4950cd8c2c33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.154 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6462ae38-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.154 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.155 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6462ae38-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:00 np0005531888 kernel: tap6462ae38-e0: entered promiscuous mode
Nov 22 03:52:00 np0005531888 NetworkManager[55166]: <info>  [1763801520.1597] manager: (tap6462ae38-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.163 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.165 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6462ae38-e0, col_values=(('external_ids', {'iface-id': '36a49105-890a-4b11-8fcd-be2c813442f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:00 np0005531888 ovn_controller[95067]: 2025-11-22T08:52:00Z|00763|binding|INFO|Releasing lport 36a49105-890a-4b11-8fcd-be2c813442f9 from this chassis (sb_readonly=0)
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.170 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6462ae38-eefd-46f7-8dfd-98d64cb746b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6462ae38-eefd-46f7-8dfd-98d64cb746b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.171 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0b75c19c-025e-4665-b0eb-4d63b7930799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.172 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-6462ae38-eefd-46f7-8dfd-98d64cb746b6
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/6462ae38-eefd-46f7-8dfd-98d64cb746b6.pid.haproxy
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 6462ae38-eefd-46f7-8dfd-98d64cb746b6
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.174 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'env', 'PROCESS_TAG=haproxy-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6462ae38-eefd-46f7-8dfd-98d64cb746b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.181 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.349 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801520.348726, 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.349 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] VM Started (Lifecycle Event)#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.370 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.375 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801520.3488245, 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.375 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.394 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.398 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.422 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:52:00 np0005531888 podman[252567]: 2025-11-22 08:52:00.541832537 +0000 UTC m=+0.062086199 container create 704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 03:52:00 np0005531888 systemd[1]: Started libpod-conmon-704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796.scope.
Nov 22 03:52:00 np0005531888 podman[252567]: 2025-11-22 08:52:00.506192 +0000 UTC m=+0.026445692 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:52:00 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:52:00 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37e0646b28ab0ad1f4679cf8ea2d689c0a254623aa724e8304307a61ab632590/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:52:00 np0005531888 podman[252567]: 2025-11-22 08:52:00.639115879 +0000 UTC m=+0.159369561 container init 704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:52:00 np0005531888 podman[252567]: 2025-11-22 08:52:00.644478441 +0000 UTC m=+0.164732123 container start 704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:52:00 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [NOTICE]   (252586) : New worker (252588) forked
Nov 22 03:52:00 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [NOTICE]   (252586) : Loading success.
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.700 186792 DEBUG nova.compute.manager [req-3bf1e60c-033c-4f1a-baa9-d4e41add62f7 req-4733dcdb-bdb2-41af-abd2-9612c17a4345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.701 186792 DEBUG oslo_concurrency.lockutils [req-3bf1e60c-033c-4f1a-baa9-d4e41add62f7 req-4733dcdb-bdb2-41af-abd2-9612c17a4345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.701 186792 DEBUG oslo_concurrency.lockutils [req-3bf1e60c-033c-4f1a-baa9-d4e41add62f7 req-4733dcdb-bdb2-41af-abd2-9612c17a4345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.702 186792 DEBUG oslo_concurrency.lockutils [req-3bf1e60c-033c-4f1a-baa9-d4e41add62f7 req-4733dcdb-bdb2-41af-abd2-9612c17a4345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.702 186792 DEBUG nova.compute.manager [req-3bf1e60c-033c-4f1a-baa9-d4e41add62f7 req-4733dcdb-bdb2-41af-abd2-9612c17a4345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Processing event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.703 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.707 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801520.7068894, 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.707 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.708 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.711 186792 INFO nova.virt.libvirt.driver [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Instance spawned successfully.#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.711 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.727 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.734 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.739 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.740 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.740 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.741 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.741 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.741 186792 DEBUG nova.virt.libvirt.driver [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.767 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.796 186792 DEBUG nova.network.neutron [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Updated VIF entry in instance network info cache for port 51161e59-b6f9-460b-bd0f-76262417cbcc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.797 186792 DEBUG nova.network.neutron [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Updating instance_info_cache with network_info: [{"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.811 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.812 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:00 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:00.813 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.820 186792 INFO nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Took 7.00 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.820 186792 DEBUG nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.821 186792 DEBUG oslo_concurrency.lockutils [req-4717d545-18c0-4cd2-816c-62fb48253a60 req-05cf9cbc-436c-42a2-8a63-a1e000e58304 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-17df6007-a93e-4318-a7a4-c3bc20dfd8f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.903 186792 INFO nova.compute.manager [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Took 7.46 seconds to build instance.#033[00m
Nov 22 03:52:00 np0005531888 nova_compute[186788]: 2025-11-22 08:52:00.920 186792 DEBUG oslo_concurrency.lockutils [None req-01dcf103-2860-471b-9959-0cf88e4bbeee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:01.816 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:02 np0005531888 nova_compute[186788]: 2025-11-22 08:52:02.788 186792 DEBUG nova.compute.manager [req-64376714-3126-46f5-a60c-4af49effda9d req-d7f84201-eeac-464d-b559-2acee3da9718 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:52:02 np0005531888 nova_compute[186788]: 2025-11-22 08:52:02.788 186792 DEBUG oslo_concurrency.lockutils [req-64376714-3126-46f5-a60c-4af49effda9d req-d7f84201-eeac-464d-b559-2acee3da9718 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:02 np0005531888 nova_compute[186788]: 2025-11-22 08:52:02.788 186792 DEBUG oslo_concurrency.lockutils [req-64376714-3126-46f5-a60c-4af49effda9d req-d7f84201-eeac-464d-b559-2acee3da9718 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:02 np0005531888 nova_compute[186788]: 2025-11-22 08:52:02.788 186792 DEBUG oslo_concurrency.lockutils [req-64376714-3126-46f5-a60c-4af49effda9d req-d7f84201-eeac-464d-b559-2acee3da9718 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:02 np0005531888 nova_compute[186788]: 2025-11-22 08:52:02.789 186792 DEBUG nova.compute.manager [req-64376714-3126-46f5-a60c-4af49effda9d req-d7f84201-eeac-464d-b559-2acee3da9718 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] No waiting events found dispatching network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:52:02 np0005531888 nova_compute[186788]: 2025-11-22 08:52:02.789 186792 WARNING nova.compute.manager [req-64376714-3126-46f5-a60c-4af49effda9d req-d7f84201-eeac-464d-b559-2acee3da9718 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received unexpected event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:52:03 np0005531888 nova_compute[186788]: 2025-11-22 08:52:03.828 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:04 np0005531888 nova_compute[186788]: 2025-11-22 08:52:04.717 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:08 np0005531888 podman[252600]: 2025-11-22 08:52:08.68254684 +0000 UTC m=+0.050855831 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:52:08 np0005531888 podman[252599]: 2025-11-22 08:52:08.686739364 +0000 UTC m=+0.057362753 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:52:08 np0005531888 nova_compute[186788]: 2025-11-22 08:52:08.831 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:09 np0005531888 nova_compute[186788]: 2025-11-22 08:52:09.719 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:13 np0005531888 podman[252659]: 2025-11-22 08:52:13.724314287 +0000 UTC m=+0.063539104 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 22 03:52:13 np0005531888 podman[252658]: 2025-11-22 08:52:13.727197198 +0000 UTC m=+0.065102922 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 22 03:52:13 np0005531888 podman[252660]: 2025-11-22 08:52:13.749595139 +0000 UTC m=+0.087540624 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:52:13 np0005531888 nova_compute[186788]: 2025-11-22 08:52:13.834 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:14 np0005531888 nova_compute[186788]: 2025-11-22 08:52:14.722 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:18 np0005531888 nova_compute[186788]: 2025-11-22 08:52:18.835 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:52:18Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:5a:ba 10.100.0.12
Nov 22 03:52:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:52:18Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:5a:ba 10.100.0.12
Nov 22 03:52:19 np0005531888 nova_compute[186788]: 2025-11-22 08:52:19.725 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:23 np0005531888 nova_compute[186788]: 2025-11-22 08:52:23.838 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:24 np0005531888 nova_compute[186788]: 2025-11-22 08:52:24.727 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.433 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.433 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.433 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.434 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.434 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.441 186792 INFO nova.compute.manager [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Terminating instance#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.448 186792 DEBUG nova.compute.manager [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:52:25 np0005531888 kernel: tap51161e59-b6 (unregistering): left promiscuous mode
Nov 22 03:52:25 np0005531888 NetworkManager[55166]: <info>  [1763801545.4749] device (tap51161e59-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:52:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:52:25Z|00764|binding|INFO|Releasing lport 51161e59-b6f9-460b-bd0f-76262417cbcc from this chassis (sb_readonly=0)
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.485 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:52:25Z|00765|binding|INFO|Setting lport 51161e59-b6f9-460b-bd0f-76262417cbcc down in Southbound
Nov 22 03:52:25 np0005531888 ovn_controller[95067]: 2025-11-22T08:52:25Z|00766|binding|INFO|Removing iface tap51161e59-b6 ovn-installed in OVS
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.488 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:25.495 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:5a:ba 10.100.0.12'], port_security=['fa:16:3e:a3:5a:ba 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '17df6007-a93e-4318-a7a4-c3bc20dfd8f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e6b1900-aaed-4594-90b0-32a5351bb717', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b91ecf2-a2ff-45e5-963e-c9d8b70b8af0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=51161e59-b6f9-460b-bd0f-76262417cbcc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:52:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:25.498 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 51161e59-b6f9-460b-bd0f-76262417cbcc in datapath 6462ae38-eefd-46f7-8dfd-98d64cb746b6 unbound from our chassis#033[00m
Nov 22 03:52:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:25.500 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6462ae38-eefd-46f7-8dfd-98d64cb746b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:52:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:25.502 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dc552a-bb47-4c50-8552-e9fe3165e2e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:25.502 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 namespace which is not needed anymore#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.512 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Nov 22 03:52:25 np0005531888 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b6.scope: Consumed 14.023s CPU time.
Nov 22 03:52:25 np0005531888 systemd-machined[153106]: Machine qemu-87-instance-000000b6 terminated.
Nov 22 03:52:25 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [NOTICE]   (252586) : haproxy version is 2.8.14-c23fe91
Nov 22 03:52:25 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [NOTICE]   (252586) : path to executable is /usr/sbin/haproxy
Nov 22 03:52:25 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [WARNING]  (252586) : Exiting Master process...
Nov 22 03:52:25 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [WARNING]  (252586) : Exiting Master process...
Nov 22 03:52:25 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [ALERT]    (252586) : Current worker (252588) exited with code 143 (Terminated)
Nov 22 03:52:25 np0005531888 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[252582]: [WARNING]  (252586) : All workers exited. Exiting... (0)
Nov 22 03:52:25 np0005531888 NetworkManager[55166]: <info>  [1763801545.6730] manager: (tap51161e59-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/365)
Nov 22 03:52:25 np0005531888 systemd[1]: libpod-704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796.scope: Deactivated successfully.
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.675 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.680 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 podman[252751]: 2025-11-22 08:52:25.681597505 +0000 UTC m=+0.084203062 container died 704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.728 186792 INFO nova.virt.libvirt.driver [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Instance destroyed successfully.#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.729 186792 DEBUG nova.objects.instance [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:52:25 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796-userdata-shm.mount: Deactivated successfully.
Nov 22 03:52:25 np0005531888 systemd[1]: var-lib-containers-storage-overlay-37e0646b28ab0ad1f4679cf8ea2d689c0a254623aa724e8304307a61ab632590-merged.mount: Deactivated successfully.
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.747 186792 DEBUG nova.virt.libvirt.vif [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:51:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-0-1024383164',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=182,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiBPbwkfvp292DdxiAKQErFTXea99iSJ0iNktwUcZngbhgOVkJ8LcCoiBgeLTItTjMH75el9p0D+2vq3rbfgroqRlNCO9aORnrX2+bekE/q3IKlWTvN5P4p1NDQcXdubA==',key_name='tempest-TestSecurityGroupsBasicOps-1102160890',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:52:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-jz68ypq7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:52:00Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=17df6007-a93e-4318-a7a4-c3bc20dfd8f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.748 186792 DEBUG nova.network.os_vif_util [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "51161e59-b6f9-460b-bd0f-76262417cbcc", "address": "fa:16:3e:a3:5a:ba", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51161e59-b6", "ovs_interfaceid": "51161e59-b6f9-460b-bd0f-76262417cbcc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.749 186792 DEBUG nova.network.os_vif_util [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.749 186792 DEBUG os_vif [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.750 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.751 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51161e59-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.755 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.757 186792 INFO os_vif [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:5a:ba,bridge_name='br-int',has_traffic_filtering=True,id=51161e59-b6f9-460b-bd0f-76262417cbcc,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51161e59-b6')#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.758 186792 INFO nova.virt.libvirt.driver [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Deleting instance files /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4_del#033[00m
Nov 22 03:52:25 np0005531888 nova_compute[186788]: 2025-11-22 08:52:25.759 186792 INFO nova.virt.libvirt.driver [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Deletion of /var/lib/nova/instances/17df6007-a93e-4318-a7a4-c3bc20dfd8f4_del complete#033[00m
Nov 22 03:52:25 np0005531888 podman[252751]: 2025-11-22 08:52:25.764854384 +0000 UTC m=+0.167459931 container cleanup 704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:52:25 np0005531888 systemd[1]: libpod-conmon-704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796.scope: Deactivated successfully.
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.038 186792 INFO nova.compute.manager [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.039 186792 DEBUG oslo.service.loopingcall [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.039 186792 DEBUG nova.compute.manager [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.040 186792 DEBUG nova.network.neutron [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.204 186792 DEBUG nova.compute.manager [req-1f2bdbaa-cbf4-45e0-8eaf-1e176fcc3d98 req-7d8e92ec-ef30-42fd-8af8-806f25cde411 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-vif-unplugged-51161e59-b6f9-460b-bd0f-76262417cbcc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.204 186792 DEBUG oslo_concurrency.lockutils [req-1f2bdbaa-cbf4-45e0-8eaf-1e176fcc3d98 req-7d8e92ec-ef30-42fd-8af8-806f25cde411 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.205 186792 DEBUG oslo_concurrency.lockutils [req-1f2bdbaa-cbf4-45e0-8eaf-1e176fcc3d98 req-7d8e92ec-ef30-42fd-8af8-806f25cde411 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.205 186792 DEBUG oslo_concurrency.lockutils [req-1f2bdbaa-cbf4-45e0-8eaf-1e176fcc3d98 req-7d8e92ec-ef30-42fd-8af8-806f25cde411 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.205 186792 DEBUG nova.compute.manager [req-1f2bdbaa-cbf4-45e0-8eaf-1e176fcc3d98 req-7d8e92ec-ef30-42fd-8af8-806f25cde411 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] No waiting events found dispatching network-vif-unplugged-51161e59-b6f9-460b-bd0f-76262417cbcc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.205 186792 DEBUG nova.compute.manager [req-1f2bdbaa-cbf4-45e0-8eaf-1e176fcc3d98 req-7d8e92ec-ef30-42fd-8af8-806f25cde411 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-vif-unplugged-51161e59-b6f9-460b-bd0f-76262417cbcc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:52:26 np0005531888 podman[252798]: 2025-11-22 08:52:26.237104841 +0000 UTC m=+0.449626302 container remove 704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.242 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[68ba56fe-ab65-4e64-a79d-d485061648f8]: (4, ('Sat Nov 22 08:52:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 (704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796)\n704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796\nSat Nov 22 08:52:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 (704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796)\n704a7864fb1e7cb8074c10c98ea3f6e15a63de4b814e0e65a4a87d0927259796\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.244 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[c411adc9-7d99-45ed-8005-2332547854c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.244 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6462ae38-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.246 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:26 np0005531888 kernel: tap6462ae38-e0: left promiscuous mode
Nov 22 03:52:26 np0005531888 nova_compute[186788]: 2025-11-22 08:52:26.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.260 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[40daa8b0-658b-4d82-8c3c-1beed2c2c98c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.274 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[60cbe7de-f00b-4227-a4ca-ef795060c35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.274 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d5914fa1-9746-411f-b597-6b6edc1d3081]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.289 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[34c51ee8-5018-49f1-8500-60a7b1c363e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 826061, 'reachable_time': 40984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252815, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.291 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:52:26 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:26.291 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8cf1c6-f822-499c-ab76-65fa2534a1d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:52:26 np0005531888 systemd[1]: run-netns-ovnmeta\x2d6462ae38\x2deefd\x2d46f7\x2d8dfd\x2d98d64cb746b6.mount: Deactivated successfully.
Nov 22 03:52:28 np0005531888 nova_compute[186788]: 2025-11-22 08:52:28.284 186792 DEBUG nova.compute.manager [req-fb155eb8-30fc-41b6-bd64-3d48c35d8398 req-acf0286e-5730-4fbe-9e91-76f2b75c43f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:52:28 np0005531888 nova_compute[186788]: 2025-11-22 08:52:28.285 186792 DEBUG oslo_concurrency.lockutils [req-fb155eb8-30fc-41b6-bd64-3d48c35d8398 req-acf0286e-5730-4fbe-9e91-76f2b75c43f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:28 np0005531888 nova_compute[186788]: 2025-11-22 08:52:28.285 186792 DEBUG oslo_concurrency.lockutils [req-fb155eb8-30fc-41b6-bd64-3d48c35d8398 req-acf0286e-5730-4fbe-9e91-76f2b75c43f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:28 np0005531888 nova_compute[186788]: 2025-11-22 08:52:28.285 186792 DEBUG oslo_concurrency.lockutils [req-fb155eb8-30fc-41b6-bd64-3d48c35d8398 req-acf0286e-5730-4fbe-9e91-76f2b75c43f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:28 np0005531888 nova_compute[186788]: 2025-11-22 08:52:28.285 186792 DEBUG nova.compute.manager [req-fb155eb8-30fc-41b6-bd64-3d48c35d8398 req-acf0286e-5730-4fbe-9e91-76f2b75c43f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] No waiting events found dispatching network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:52:28 np0005531888 nova_compute[186788]: 2025-11-22 08:52:28.286 186792 WARNING nova.compute.manager [req-fb155eb8-30fc-41b6-bd64-3d48c35d8398 req-acf0286e-5730-4fbe-9e91-76f2b75c43f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received unexpected event network-vif-plugged-51161e59-b6f9-460b-bd0f-76262417cbcc for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:52:28 np0005531888 podman[252816]: 2025-11-22 08:52:28.691091613 +0000 UTC m=+0.057167757 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:52:28 np0005531888 podman[252817]: 2025-11-22 08:52:28.692098817 +0000 UTC m=+0.054381169 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.551 186792 DEBUG nova.network.neutron [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.574 186792 INFO nova.compute.manager [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Took 3.53 seconds to deallocate network for instance.#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.649 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.649 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.681 186792 DEBUG nova.compute.manager [req-2815ade8-dc42-4b13-a2b6-28eec656844a req-96efb4b9-1b29-4e2a-a654-8499ff84392c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Received event network-vif-deleted-51161e59-b6f9-460b-bd0f-76262417cbcc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.704 186792 DEBUG nova.compute.provider_tree [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.715 186792 DEBUG nova.scheduler.client.report [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.729 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.736 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.764 186792 INFO nova.scheduler.client.report [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance 17df6007-a93e-4318-a7a4-c3bc20dfd8f4#033[00m
Nov 22 03:52:29 np0005531888 nova_compute[186788]: 2025-11-22 08:52:29.823 186792 DEBUG oslo_concurrency.lockutils [None req-36bc52ce-200a-425c-9e07-78eff309a7ee 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "17df6007-a93e-4318-a7a4-c3bc20dfd8f4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:30 np0005531888 nova_compute[186788]: 2025-11-22 08:52:30.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:33 np0005531888 nova_compute[186788]: 2025-11-22 08:52:33.312 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:34 np0005531888 nova_compute[186788]: 2025-11-22 08:52:34.732 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:34 np0005531888 nova_compute[186788]: 2025-11-22 08:52:34.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:34 np0005531888 nova_compute[186788]: 2025-11-22 08:52:34.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:52:34 np0005531888 nova_compute[186788]: 2025-11-22 08:52:34.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:52:34 np0005531888 nova_compute[186788]: 2025-11-22 08:52:34.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:52:35 np0005531888 nova_compute[186788]: 2025-11-22 08:52:35.755 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:35 np0005531888 nova_compute[186788]: 2025-11-22 08:52:35.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.859 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:52:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:52:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:39 np0005531888 podman[252860]: 2025-11-22 08:52:39.684348673 +0000 UTC m=+0.053982628 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 03:52:39 np0005531888 podman[252861]: 2025-11-22 08:52:39.684400064 +0000 UTC m=+0.051206880 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:52:39 np0005531888 nova_compute[186788]: 2025-11-22 08:52:39.734 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:39 np0005531888 nova_compute[186788]: 2025-11-22 08:52:39.913 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:39 np0005531888 nova_compute[186788]: 2025-11-22 08:52:39.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:40 np0005531888 nova_compute[186788]: 2025-11-22 08:52:40.726 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801545.7248297, 17df6007-a93e-4318-a7a4-c3bc20dfd8f4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:52:40 np0005531888 nova_compute[186788]: 2025-11-22 08:52:40.726 186792 INFO nova.compute.manager [-] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:52:40 np0005531888 nova_compute[186788]: 2025-11-22 08:52:40.742 186792 DEBUG nova.compute.manager [None req-75d83353-8b9a-4a0a-bea3-bb513e84e12c - - - - - -] [instance: 17df6007-a93e-4318-a7a4-c3bc20dfd8f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:52:40 np0005531888 nova_compute[186788]: 2025-11-22 08:52:40.758 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:41 np0005531888 nova_compute[186788]: 2025-11-22 08:52:41.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:44 np0005531888 podman[252901]: 2025-11-22 08:52:44.694596503 +0000 UTC m=+0.067723217 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Nov 22 03:52:44 np0005531888 podman[252902]: 2025-11-22 08:52:44.732758692 +0000 UTC m=+0.100259858 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:52:44 np0005531888 nova_compute[186788]: 2025-11-22 08:52:44.735 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:44 np0005531888 podman[252903]: 2025-11-22 08:52:44.759043519 +0000 UTC m=+0.122849113 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:52:45 np0005531888 nova_compute[186788]: 2025-11-22 08:52:45.762 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:47 np0005531888 nova_compute[186788]: 2025-11-22 08:52:47.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:47 np0005531888 nova_compute[186788]: 2025-11-22 08:52:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:52:48 np0005531888 nova_compute[186788]: 2025-11-22 08:52:48.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:49 np0005531888 nova_compute[186788]: 2025-11-22 08:52:49.736 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:49 np0005531888 nova_compute[186788]: 2025-11-22 08:52:49.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:50 np0005531888 nova_compute[186788]: 2025-11-22 08:52:50.764 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:50 np0005531888 nova_compute[186788]: 2025-11-22 08:52:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:51 np0005531888 nova_compute[186788]: 2025-11-22 08:52:51.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:51 np0005531888 nova_compute[186788]: 2025-11-22 08:52:51.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:51 np0005531888 nova_compute[186788]: 2025-11-22 08:52:51.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:51 np0005531888 nova_compute[186788]: 2025-11-22 08:52:51.973 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:51 np0005531888 nova_compute[186788]: 2025-11-22 08:52:51.974 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.148 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.150 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5711MB free_disk=73.25884628295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.150 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.150 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.202 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.202 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.227 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.240 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.259 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:52:52 np0005531888 nova_compute[186788]: 2025-11-22 08:52:52.260 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:54 np0005531888 nova_compute[186788]: 2025-11-22 08:52:54.738 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:55 np0005531888 nova_compute[186788]: 2025-11-22 08:52:55.766 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:59 np0005531888 podman[252964]: 2025-11-22 08:52:59.678458939 +0000 UTC m=+0.050724839 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:52:59 np0005531888 podman[252965]: 2025-11-22 08:52:59.691330495 +0000 UTC m=+0.057303441 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:52:59 np0005531888 nova_compute[186788]: 2025-11-22 08:52:59.740 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:00 np0005531888 nova_compute[186788]: 2025-11-22 08:53:00.770 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:04 np0005531888 nova_compute[186788]: 2025-11-22 08:53:04.742 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:05 np0005531888 nova_compute[186788]: 2025-11-22 08:53:05.774 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:09 np0005531888 nova_compute[186788]: 2025-11-22 08:53:09.744 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:10 np0005531888 podman[253008]: 2025-11-22 08:53:10.688659898 +0000 UTC m=+0.059532946 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:53:10 np0005531888 podman[253009]: 2025-11-22 08:53:10.694240185 +0000 UTC m=+0.056146662 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:53:10 np0005531888 nova_compute[186788]: 2025-11-22 08:53:10.776 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:53:12.867 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:53:12 np0005531888 nova_compute[186788]: 2025-11-22 08:53:12.867 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:53:12.868 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:53:14 np0005531888 nova_compute[186788]: 2025-11-22 08:53:14.746 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:14 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:53:14.870 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:53:15 np0005531888 podman[253053]: 2025-11-22 08:53:15.684240316 +0000 UTC m=+0.060281844 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:53:15 np0005531888 podman[253052]: 2025-11-22 08:53:15.700791464 +0000 UTC m=+0.080755588 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 22 03:53:15 np0005531888 podman[253054]: 2025-11-22 08:53:15.736390609 +0000 UTC m=+0.109320980 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:53:15 np0005531888 nova_compute[186788]: 2025-11-22 08:53:15.778 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:18 np0005531888 nova_compute[186788]: 2025-11-22 08:53:18.255 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:18 np0005531888 ovn_controller[95067]: 2025-11-22T08:53:18Z|00767|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 03:53:19 np0005531888 nova_compute[186788]: 2025-11-22 08:53:19.749 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:20 np0005531888 nova_compute[186788]: 2025-11-22 08:53:20.780 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:24 np0005531888 nova_compute[186788]: 2025-11-22 08:53:24.752 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:25 np0005531888 nova_compute[186788]: 2025-11-22 08:53:25.782 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:29 np0005531888 nova_compute[186788]: 2025-11-22 08:53:29.753 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:30 np0005531888 podman[253116]: 2025-11-22 08:53:30.671208269 +0000 UTC m=+0.048116876 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:53:30 np0005531888 podman[253117]: 2025-11-22 08:53:30.679484522 +0000 UTC m=+0.050484383 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:53:30 np0005531888 nova_compute[186788]: 2025-11-22 08:53:30.785 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:34 np0005531888 nova_compute[186788]: 2025-11-22 08:53:34.756 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:34 np0005531888 nova_compute[186788]: 2025-11-22 08:53:34.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:34 np0005531888 nova_compute[186788]: 2025-11-22 08:53:34.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:53:34 np0005531888 nova_compute[186788]: 2025-11-22 08:53:34.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:53:34 np0005531888 nova_compute[186788]: 2025-11-22 08:53:34.974 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:53:34 np0005531888 nova_compute[186788]: 2025-11-22 08:53:34.974 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:35 np0005531888 nova_compute[186788]: 2025-11-22 08:53:35.788 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:53:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:53:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:53:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:53:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:53:36.875 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:53:36 np0005531888 nova_compute[186788]: 2025-11-22 08:53:36.970 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:39 np0005531888 nova_compute[186788]: 2025-11-22 08:53:39.757 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:40 np0005531888 nova_compute[186788]: 2025-11-22 08:53:40.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:41 np0005531888 podman[253158]: 2025-11-22 08:53:41.682487853 +0000 UTC m=+0.060081259 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:53:41 np0005531888 podman[253159]: 2025-11-22 08:53:41.687873446 +0000 UTC m=+0.056857211 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:53:43 np0005531888 nova_compute[186788]: 2025-11-22 08:53:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:44 np0005531888 nova_compute[186788]: 2025-11-22 08:53:44.760 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:45 np0005531888 nova_compute[186788]: 2025-11-22 08:53:45.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:46 np0005531888 podman[253200]: 2025-11-22 08:53:46.689472383 +0000 UTC m=+0.060640293 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 03:53:46 np0005531888 podman[253199]: 2025-11-22 08:53:46.689540994 +0000 UTC m=+0.062036876 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Nov 22 03:53:46 np0005531888 podman[253201]: 2025-11-22 08:53:46.720519076 +0000 UTC m=+0.089596355 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 03:53:48 np0005531888 nova_compute[186788]: 2025-11-22 08:53:48.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:48 np0005531888 nova_compute[186788]: 2025-11-22 08:53:48.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:48 np0005531888 nova_compute[186788]: 2025-11-22 08:53:48.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:53:49 np0005531888 nova_compute[186788]: 2025-11-22 08:53:49.762 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:50 np0005531888 nova_compute[186788]: 2025-11-22 08:53:50.796 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:50 np0005531888 nova_compute[186788]: 2025-11-22 08:53:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:51 np0005531888 nova_compute[186788]: 2025-11-22 08:53:51.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:53 np0005531888 nova_compute[186788]: 2025-11-22 08:53:53.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:53 np0005531888 nova_compute[186788]: 2025-11-22 08:53:53.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:53:53 np0005531888 nova_compute[186788]: 2025-11-22 08:53:53.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:53:53 np0005531888 nova_compute[186788]: 2025-11-22 08:53:53.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:53:53 np0005531888 nova_compute[186788]: 2025-11-22 08:53:53.990 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.132 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.133 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5717MB free_disk=73.25884628295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.133 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.133 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.251 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.251 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.307 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.348 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.350 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.350 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:53:54 np0005531888 nova_compute[186788]: 2025-11-22 08:53:54.763 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:55 np0005531888 nova_compute[186788]: 2025-11-22 08:53:55.799 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:59 np0005531888 nova_compute[186788]: 2025-11-22 08:53:59.765 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:00 np0005531888 nova_compute[186788]: 2025-11-22 08:54:00.802 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:01 np0005531888 podman[253262]: 2025-11-22 08:54:01.669260571 +0000 UTC m=+0.046084125 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:54:01 np0005531888 podman[253263]: 2025-11-22 08:54:01.669683801 +0000 UTC m=+0.044355602 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:54:04 np0005531888 nova_compute[186788]: 2025-11-22 08:54:04.767 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:05 np0005531888 nova_compute[186788]: 2025-11-22 08:54:05.806 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.644 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.644 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.668 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.769 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.837 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.838 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.847 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.848 186792 INFO nova.compute.claims [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:54:09 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.987 186792 DEBUG nova.compute.provider_tree [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:09.999 186792 DEBUG nova.scheduler.client.report [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.024 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.025 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.097 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.097 186792 DEBUG nova.network.neutron [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.118 186792 INFO nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.142 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.266 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.268 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.268 186792 INFO nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Creating image(s)#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.269 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.270 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.271 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.291 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.356 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.357 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.357 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.368 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.422 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.423 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.714 186792 DEBUG nova.policy [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:54:10 np0005531888 nova_compute[186788]: 2025-11-22 08:54:10.809 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.040 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk 1073741824" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.041 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.041 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.111 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.112 186792 DEBUG nova.virt.disk.api [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.113 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.175 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.177 186792 DEBUG nova.virt.disk.api [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.177 186792 DEBUG nova.objects.instance [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid 90907795-67f7-464a-824a-fcd6047735a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.190 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.190 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Ensure instance console log exists: /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.191 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.191 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:11 np0005531888 nova_compute[186788]: 2025-11-22 08:54:11.191 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:12 np0005531888 podman[253317]: 2025-11-22 08:54:12.680852642 +0000 UTC m=+0.056974612 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:54:12 np0005531888 podman[253318]: 2025-11-22 08:54:12.681323634 +0000 UTC m=+0.054801529 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:54:12 np0005531888 nova_compute[186788]: 2025-11-22 08:54:12.837 186792 DEBUG nova.network.neutron [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Successfully created port: 4ba6d1e2-e54f-4374-b3a4-34f95f06745a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.109 186792 DEBUG nova.network.neutron [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Successfully updated port: 4ba6d1e2-e54f-4374-b3a4-34f95f06745a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.142 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.142 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.142 186792 DEBUG nova.network.neutron [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.325 186792 DEBUG nova.compute.manager [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-changed-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.325 186792 DEBUG nova.compute.manager [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Refreshing instance network info cache due to event network-changed-4ba6d1e2-e54f-4374-b3a4-34f95f06745a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.325 186792 DEBUG oslo_concurrency.lockutils [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.710 186792 DEBUG nova.network.neutron [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:54:14 np0005531888 nova_compute[186788]: 2025-11-22 08:54:14.770 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.812 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.898 186792 DEBUG nova.network.neutron [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.962 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.963 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Instance network_info: |[{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.963 186792 DEBUG oslo_concurrency.lockutils [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.963 186792 DEBUG nova.network.neutron [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Refreshing network info cache for port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.966 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Start _get_guest_xml network_info=[{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.970 186792 WARNING nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.974 186792 DEBUG nova.virt.libvirt.host [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.975 186792 DEBUG nova.virt.libvirt.host [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.980 186792 DEBUG nova.virt.libvirt.host [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.981 186792 DEBUG nova.virt.libvirt.host [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.983 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.983 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.983 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.983 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.984 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.984 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.984 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.984 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.984 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.985 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.985 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.985 186792 DEBUG nova.virt.hardware [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.988 186792 DEBUG nova.virt.libvirt.vif [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:54:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=184,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAw4mkfFiRvcPYUE0PIsjDRQ2l9YGHYiiINwqgLaZjT4Jz+8V9nq9XzUIN6IBe3EfaIEfpC2/icXPG/z2BuoLG3JQ3o5sNQAUv0uO8d73RniLqWnU/BDSHGzBNmpYdI7sw==',key_name='tempest-TestSecurityGroupsBasicOps-2127958203',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-u3849c8o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:54:10Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=90907795-67f7-464a-824a-fcd6047735a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.989 186792 DEBUG nova.network.os_vif_util [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.989 186792 DEBUG nova.network.os_vif_util [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:54:15 np0005531888 nova_compute[186788]: 2025-11-22 08:54:15.990 186792 DEBUG nova.objects.instance [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid 90907795-67f7-464a-824a-fcd6047735a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.003 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <uuid>90907795-67f7-464a-824a-fcd6047735a9</uuid>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <name>instance-000000b8</name>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004</nova:name>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:54:15</nova:creationTime>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        <nova:port uuid="4ba6d1e2-e54f-4374-b3a4-34f95f06745a">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <entry name="serial">90907795-67f7-464a-824a-fcd6047735a9</entry>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <entry name="uuid">90907795-67f7-464a-824a-fcd6047735a9</entry>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.config"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:24:77:f2"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <target dev="tap4ba6d1e2-e5"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/console.log" append="off"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:54:16 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:54:16 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:54:16 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:54:16 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.005 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Preparing to wait for external event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.005 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.005 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.005 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.006 186792 DEBUG nova.virt.libvirt.vif [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:54:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=184,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAw4mkfFiRvcPYUE0PIsjDRQ2l9YGHYiiINwqgLaZjT4Jz+8V9nq9XzUIN6IBe3EfaIEfpC2/icXPG/z2BuoLG3JQ3o5sNQAUv0uO8d73RniLqWnU/BDSHGzBNmpYdI7sw==',key_name='tempest-TestSecurityGroupsBasicOps-2127958203',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-u3849c8o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:54:10Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=90907795-67f7-464a-824a-fcd6047735a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.006 186792 DEBUG nova.network.os_vif_util [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.007 186792 DEBUG nova.network.os_vif_util [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.007 186792 DEBUG os_vif [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.008 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.008 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.011 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.011 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ba6d1e2-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.011 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ba6d1e2-e5, col_values=(('external_ids', {'iface-id': '4ba6d1e2-e54f-4374-b3a4-34f95f06745a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:77:f2', 'vm-uuid': '90907795-67f7-464a-824a-fcd6047735a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:16 np0005531888 NetworkManager[55166]: <info>  [1763801656.0137] manager: (tap4ba6d1e2-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.016 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.020 186792 INFO os_vif [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5')#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.240 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.241 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.241 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:24:77:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:54:16 np0005531888 nova_compute[186788]: 2025-11-22 08:54:16.242 186792 INFO nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Using config drive#033[00m
Nov 22 03:54:17 np0005531888 podman[253364]: 2025-11-22 08:54:17.677206761 +0000 UTC m=+0.051931269 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:54:17 np0005531888 podman[253363]: 2025-11-22 08:54:17.678280587 +0000 UTC m=+0.053820986 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 22 03:54:17 np0005531888 podman[253365]: 2025-11-22 08:54:17.70850706 +0000 UTC m=+0.078547373 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller)
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.736 186792 INFO nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Creating config drive at /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.config#033[00m
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.742 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzmc8udvv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.772 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.867 186792 DEBUG oslo_concurrency.processutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzmc8udvv" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:19 np0005531888 kernel: tap4ba6d1e2-e5: entered promiscuous mode
Nov 22 03:54:19 np0005531888 NetworkManager[55166]: <info>  [1763801659.9296] manager: (tap4ba6d1e2-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/367)
Nov 22 03:54:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:19Z|00768|binding|INFO|Claiming lport 4ba6d1e2-e54f-4374-b3a4-34f95f06745a for this chassis.
Nov 22 03:54:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:19Z|00769|binding|INFO|4ba6d1e2-e54f-4374-b3a4-34f95f06745a: Claiming fa:16:3e:24:77:f2 10.100.0.11
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.929 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:19 np0005531888 systemd-udevd[253445]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:54:19 np0005531888 systemd-machined[153106]: New machine qemu-88-instance-000000b8.
Nov 22 03:54:19 np0005531888 NetworkManager[55166]: <info>  [1763801659.9721] device (tap4ba6d1e2-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:54:19 np0005531888 NetworkManager[55166]: <info>  [1763801659.9728] device (tap4ba6d1e2-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:19 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:19Z|00770|binding|INFO|Setting lport 4ba6d1e2-e54f-4374-b3a4-34f95f06745a ovn-installed in OVS
Nov 22 03:54:19 np0005531888 nova_compute[186788]: 2025-11-22 08:54:19.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:19 np0005531888 systemd[1]: Started Virtual Machine qemu-88-instance-000000b8.
Nov 22 03:54:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:20Z|00771|binding|INFO|Setting lport 4ba6d1e2-e54f-4374-b3a4-34f95f06745a up in Southbound
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.053 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:77:f2 10.100.0.11'], port_security=['fa:16:3e:24:77:f2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '90907795-67f7-464a-824a-fcd6047735a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': '984f816a-7307-432f-9913-aaee1df5bf08 b3726af1-bbad-4493-94ab-7644017fbf88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=809b4a06-a3bb-45c6-b4f6-e663731ee64f, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=4ba6d1e2-e54f-4374-b3a4-34f95f06745a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.054 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a in datapath 4d568a4f-3fc1-4760-b924-569e98e1b4a7 bound to our chassis#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.055 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4d568a4f-3fc1-4760-b924-569e98e1b4a7#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.066 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[59639a0c-d729-4901-b4f6-c66cf9952918]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.067 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4d568a4f-31 in ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.069 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4d568a4f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.069 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[0aae2302-0897-4dd4-ac2e-b7b81b410641]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.070 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[d23c2dc7-d51a-4264-a226-ce30994c0a52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.082 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[7740e38a-9440-4e4d-9d4e-d96101ea5936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.094 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[775dc6d8-d1f1-4f20-90f3-b7d41385787a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.122 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[5e835e01-a35a-4b11-8953-0386143098c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 NetworkManager[55166]: <info>  [1763801660.1290] manager: (tap4d568a4f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/368)
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.132 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2da0c1fc-63a4-417d-ad72-ef138d589578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.178 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[72f3513b-a180-47f2-a13d-31bcb3a7b33b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.182 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[727d348a-a7d1-484e-82fc-a2a38bce7feb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 NetworkManager[55166]: <info>  [1763801660.2103] device (tap4d568a4f-30): carrier: link connected
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.216 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[92fef916-94bd-4207-bafd-c4fdb6e1b1b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.237 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[edbe02cf-21a5-43d3-89ab-4fdad862550e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d568a4f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:a1:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840085, 'reachable_time': 37930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253480, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.255 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[97badee2-5cfd-41de-ba03-ce210d137d13]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe74:a182'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 840085, 'tstamp': 840085}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253481, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.282 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f48362b3-d911-4f94-b4c0-72559e08ea4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d568a4f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:a1:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840085, 'reachable_time': 37930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253482, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.322 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e20ddb8d-7900-4780-8e2a-5c79b8850f4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.385 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f76d77-3c15-4d82-af32-893579baa0f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.387 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d568a4f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.388 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.388 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d568a4f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.390 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:20 np0005531888 NetworkManager[55166]: <info>  [1763801660.3909] manager: (tap4d568a4f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Nov 22 03:54:20 np0005531888 kernel: tap4d568a4f-30: entered promiscuous mode
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.392 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4d568a4f-30, col_values=(('external_ids', {'iface-id': '70dec18e-900f-4578-a135-56e2da0bb3bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:20Z|00772|binding|INFO|Releasing lport 70dec18e-900f-4578-a135-56e2da0bb3bd from this chassis (sb_readonly=0)
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.393 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.405 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.406 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4d568a4f-3fc1-4760-b924-569e98e1b4a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4d568a4f-3fc1-4760-b924-569e98e1b4a7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.407 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2bac7d3a-79bc-4533-bb5c-baa21e2b9c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.408 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-4d568a4f-3fc1-4760-b924-569e98e1b4a7
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/4d568a4f-3fc1-4760-b924-569e98e1b4a7.pid.haproxy
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 4d568a4f-3fc1-4760-b924-569e98e1b4a7
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:54:20 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:20.408 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'env', 'PROCESS_TAG=haproxy-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4d568a4f-3fc1-4760-b924-569e98e1b4a7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.430 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801660.4301174, 90907795-67f7-464a-824a-fcd6047735a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.431 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] VM Started (Lifecycle Event)#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.450 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.454 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801660.4313023, 90907795-67f7-464a-824a-fcd6047735a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.454 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.483 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.486 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:54:20 np0005531888 nova_compute[186788]: 2025-11-22 08:54:20.523 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:54:20 np0005531888 podman[253521]: 2025-11-22 08:54:20.750063918 +0000 UTC m=+0.032724136 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:54:20 np0005531888 podman[253521]: 2025-11-22 08:54:20.843740243 +0000 UTC m=+0.126400441 container create 4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:54:20 np0005531888 systemd[1]: Started libpod-conmon-4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3.scope.
Nov 22 03:54:20 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:54:20 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c0ffd2156db8f50409e720985cf860d316f1fc39b98096a13619a5a09fe2f15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.013 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:21 np0005531888 podman[253521]: 2025-11-22 08:54:21.014145434 +0000 UTC m=+0.296805662 container init 4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:54:21 np0005531888 podman[253521]: 2025-11-22 08:54:21.020603073 +0000 UTC m=+0.303263271 container start 4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:54:21 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [NOTICE]   (253540) : New worker (253542) forked
Nov 22 03:54:21 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [NOTICE]   (253540) : Loading success.
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.221 186792 DEBUG nova.compute.manager [req-a6d5ca5a-f160-40c5-8212-a730592f9ecf req-4ca50de8-2fc3-419f-a565-632961a3429a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.221 186792 DEBUG oslo_concurrency.lockutils [req-a6d5ca5a-f160-40c5-8212-a730592f9ecf req-4ca50de8-2fc3-419f-a565-632961a3429a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.221 186792 DEBUG oslo_concurrency.lockutils [req-a6d5ca5a-f160-40c5-8212-a730592f9ecf req-4ca50de8-2fc3-419f-a565-632961a3429a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.221 186792 DEBUG oslo_concurrency.lockutils [req-a6d5ca5a-f160-40c5-8212-a730592f9ecf req-4ca50de8-2fc3-419f-a565-632961a3429a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.222 186792 DEBUG nova.compute.manager [req-a6d5ca5a-f160-40c5-8212-a730592f9ecf req-4ca50de8-2fc3-419f-a565-632961a3429a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Processing event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.222 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.227 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801661.2272966, 90907795-67f7-464a-824a-fcd6047735a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.227 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.230 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.233 186792 INFO nova.virt.libvirt.driver [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Instance spawned successfully.#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.233 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.254 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.254 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.254 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.255 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.255 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.255 186792 DEBUG nova.virt.libvirt.driver [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.262 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.265 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.285 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.432 186792 INFO nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Took 11.16 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.432 186792 DEBUG nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.472 186792 DEBUG nova.network.neutron [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updated VIF entry in instance network info cache for port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.472 186792 DEBUG nova.network.neutron [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.513 186792 DEBUG oslo_concurrency.lockutils [req-516624ce-d81b-4424-a8bd-db39abdf43be req-80311e17-35fa-401d-b8eb-3c6bf7957296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.556 186792 INFO nova.compute.manager [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Took 11.82 seconds to build instance.#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.649 186792 DEBUG oslo_concurrency.lockutils [None req-74c8cd9e-852e-4d8f-8e46-bc7f3adf68b4 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.954 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.954 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.955 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.955 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.955 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.956 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.975 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.987 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.987 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Image id eb6eb4ac-7956-4021-b3a0-d612ae61d38c yields fingerprint 169b85625b85d2ad681b52460a0c196a18b2a726 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.987 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] image eb6eb4ac-7956-4021-b3a0-d612ae61d38c at (/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726): checking#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.988 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] image eb6eb4ac-7956-4021-b3a0-d612ae61d38c at (/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.989 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.990 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] 90907795-67f7-464a-824a-fcd6047735a9 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.990 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] 90907795-67f7-464a-824a-fcd6047735a9 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129#033[00m
Nov 22 03:54:21 np0005531888 nova_compute[186788]: 2025-11-22 08:54:21.990 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.053 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.054 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 90907795-67f7-464a-824a-fcd6047735a9 is backed by 169b85625b85d2ad681b52460a0c196a18b2a726 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.054 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.054 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.054 186792 WARNING nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.055 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Active base files: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.055 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53 /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.055 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.055 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/91b50f83eaa261e984af5000107bf50c6f917c53#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.055 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/0ee57c243eb3946217335143ef545d69665f34f2#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.056 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.056 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.056 186792 DEBUG nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 22 03:54:22 np0005531888 nova_compute[186788]: 2025-11-22 08:54:22.056 186792 INFO nova.virt.libvirt.imagecache [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Nov 22 03:54:23 np0005531888 nova_compute[186788]: 2025-11-22 08:54:23.474 186792 DEBUG nova.compute.manager [req-c667e793-c856-43c8-8422-84a9f1f71c55 req-ab72d12e-d0ad-4287-a597-c4625c99827d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:54:23 np0005531888 nova_compute[186788]: 2025-11-22 08:54:23.474 186792 DEBUG oslo_concurrency.lockutils [req-c667e793-c856-43c8-8422-84a9f1f71c55 req-ab72d12e-d0ad-4287-a597-c4625c99827d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:23 np0005531888 nova_compute[186788]: 2025-11-22 08:54:23.474 186792 DEBUG oslo_concurrency.lockutils [req-c667e793-c856-43c8-8422-84a9f1f71c55 req-ab72d12e-d0ad-4287-a597-c4625c99827d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:23 np0005531888 nova_compute[186788]: 2025-11-22 08:54:23.475 186792 DEBUG oslo_concurrency.lockutils [req-c667e793-c856-43c8-8422-84a9f1f71c55 req-ab72d12e-d0ad-4287-a597-c4625c99827d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:23 np0005531888 nova_compute[186788]: 2025-11-22 08:54:23.476 186792 DEBUG nova.compute.manager [req-c667e793-c856-43c8-8422-84a9f1f71c55 req-ab72d12e-d0ad-4287-a597-c4625c99827d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] No waiting events found dispatching network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:54:23 np0005531888 nova_compute[186788]: 2025-11-22 08:54:23.476 186792 WARNING nova.compute.manager [req-c667e793-c856-43c8-8422-84a9f1f71c55 req-ab72d12e-d0ad-4287-a597-c4625c99827d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received unexpected event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a for instance with vm_state active and task_state None.#033[00m
Nov 22 03:54:24 np0005531888 nova_compute[186788]: 2025-11-22 08:54:24.773 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:25 np0005531888 nova_compute[186788]: 2025-11-22 08:54:25.423 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:25.425 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:54:25 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:25.426 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:54:26 np0005531888 nova_compute[186788]: 2025-11-22 08:54:26.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:26Z|00773|binding|INFO|Releasing lport 70dec18e-900f-4578-a135-56e2da0bb3bd from this chassis (sb_readonly=0)
Nov 22 03:54:26 np0005531888 nova_compute[186788]: 2025-11-22 08:54:26.456 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:26 np0005531888 NetworkManager[55166]: <info>  [1763801666.4670] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Nov 22 03:54:26 np0005531888 NetworkManager[55166]: <info>  [1763801666.4680] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Nov 22 03:54:26 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:26Z|00774|binding|INFO|Releasing lport 70dec18e-900f-4578-a135-56e2da0bb3bd from this chassis (sb_readonly=0)
Nov 22 03:54:26 np0005531888 nova_compute[186788]: 2025-11-22 08:54:26.490 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:26 np0005531888 nova_compute[186788]: 2025-11-22 08:54:26.495 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:27 np0005531888 nova_compute[186788]: 2025-11-22 08:54:27.117 186792 DEBUG nova.compute.manager [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-changed-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:54:27 np0005531888 nova_compute[186788]: 2025-11-22 08:54:27.117 186792 DEBUG nova.compute.manager [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Refreshing instance network info cache due to event network-changed-4ba6d1e2-e54f-4374-b3a4-34f95f06745a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:54:27 np0005531888 nova_compute[186788]: 2025-11-22 08:54:27.117 186792 DEBUG oslo_concurrency.lockutils [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:54:27 np0005531888 nova_compute[186788]: 2025-11-22 08:54:27.117 186792 DEBUG oslo_concurrency.lockutils [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:54:27 np0005531888 nova_compute[186788]: 2025-11-22 08:54:27.118 186792 DEBUG nova.network.neutron [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Refreshing network info cache for port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:54:29 np0005531888 nova_compute[186788]: 2025-11-22 08:54:29.776 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:30 np0005531888 nova_compute[186788]: 2025-11-22 08:54:30.232 186792 DEBUG nova.network.neutron [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updated VIF entry in instance network info cache for port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:54:30 np0005531888 nova_compute[186788]: 2025-11-22 08:54:30.233 186792 DEBUG nova.network.neutron [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:54:30 np0005531888 nova_compute[186788]: 2025-11-22 08:54:30.259 186792 DEBUG oslo_concurrency.lockutils [req-34a811c3-d99b-4c93-bd45-07ceb616b6c5 req-7998d8fe-275e-4172-ae24-0a97328e4291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:54:31 np0005531888 nova_compute[186788]: 2025-11-22 08:54:31.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:32 np0005531888 podman[253557]: 2025-11-22 08:54:32.701354658 +0000 UTC m=+0.048153345 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:54:32 np0005531888 podman[253556]: 2025-11-22 08:54:32.702291541 +0000 UTC m=+0.055925697 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:54:33 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:33.429 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:54:34 np0005531888 nova_compute[186788]: 2025-11-22 08:54:34.777 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.056 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.057 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.057 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.730 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.730 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.730 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:54:35 np0005531888 nova_compute[186788]: 2025-11-22 08:54:35.731 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 90907795-67f7-464a-824a-fcd6047735a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:54:36 np0005531888 nova_compute[186788]: 2025-11-22 08:54:36.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.863 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '90907795-67f7-464a-824a-fcd6047735a9', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b8', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'hostId': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.867 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 90907795-67f7-464a-824a-fcd6047735a9 / tap4ba6d1e2-e5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.868 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00387a3d-4f1f-42a1-81eb-a0dde790ce53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.864551', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df8b7828-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '08e24383a33d1bd912e3c8f01673ce34a96a565bdd684caa6aed42e9e26c1c69'}]}, 'timestamp': '2025-11-22 08:54:36.868828', '_unique_id': '5c4564a42e4a48beaa0b1f45323ed4f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.869 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:36.877 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:36.877 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:54:36.878 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.885 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.allocation volume: 28254208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.886 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b16f482-136b-4c70-a87d-da58e1445cb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28254208, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.870845', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df8e1984-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.570339558, 'message_signature': 'bb04532e36f3d51a820c57c4142487e0f703301b4b65bedb8296d4e7f3b0fba3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.870845', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df8e249c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.570339558, 'message_signature': '5f8ccacf791f0d64f60176215cccf4d13c07b4ca679f6d2f47b8f58b49cb83c3'}]}, 'timestamp': '2025-11-22 08:54:36.886244', '_unique_id': 'dcf0480c2cd34be987379edc91e61b69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.888 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cfe43ca-c0db-4c09-8329-46e39a1de2b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.888072', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df8e76cc-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '19546a7cd5b617d6103720f3566adbc849fb3e778fa840cca925bdcff422bc57'}]}, 'timestamp': '2025-11-22 08:54:36.888387', '_unique_id': '867cc662fba54b949b0654098a3b24d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.889 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cafd30c6-0073-42c8-a823-27a4d2cc7dfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.889888', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df8ebdbc-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '7a3642b7964b1ff95b809543aa0aed456aeffc4a48f0603f8fab4bd8f9db13bb'}]}, 'timestamp': '2025-11-22 08:54:36.890170', '_unique_id': '666d9dabe3994d1c8484502b103a8c35'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.890 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.891 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.891 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>]
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.892 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3c4878b-ea71-4f29-b95e-87563a0fc2ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.892133', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df8f153c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '94e44d9240281e8c33c9aad7183861706521b6f155736792a7766ba08719bfe7'}]}, 'timestamp': '2025-11-22 08:54:36.892457', '_unique_id': '30803ed6690e4f98b8fdabb845801786'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.893 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.894 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.911 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/cpu volume: 12410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '565943c2-5049-4f2b-b283-abd207aa720b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12410000000, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9', 'timestamp': '2025-11-22T08:54:36.894121', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'df920896-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.610588748, 'message_signature': 'ca19a8d1aba6a72674078d2a2b10d41c062e0d176eaaee11d12df3d0d173ceed'}]}, 'timestamp': '2025-11-22 08:54:36.911850', '_unique_id': '3cfc733a99ce47d7b4bd8c32e2d44df3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.914 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cdee37b-0879-4043-9e4c-5c5965a5b8b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.914069', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df926ee4-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '2f28a2d8499e28b2d6d47599d41d60c7a60216ee419895bb6a6841a320b273ca'}]}, 'timestamp': '2025-11-22 08:54:36.914409', '_unique_id': '1d09b02d09d5459bb5431738f5c5270f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.942 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.read.latency volume: 1426092885 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.942 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.read.latency volume: 457041917 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe47c692-3ea1-43b9-a245-deb3fe0ac77d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1426092885, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.916015', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df96c7c8-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': 'a42b6ea7a71669fbe9729f3583971fd4a56743b359d9fa23d1bf0a73a4fffb16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 457041917, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.916015', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df96d272-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '9df04dc13272ef04dd19b9f55b3627813c638bce5b8083adacbd4fe9d27211af'}]}, 'timestamp': '2025-11-22 08:54:36.943122', '_unique_id': '382f032ee95c4a0baf55da4a90ce15c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.945 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.945 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d63cc6c-64b7-48f1-8bc3-3e337f313945', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.945044', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df97279a-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.570339558, 'message_signature': '0979b727d258011693c43f25485f1ea6863e29096e71a0fbf1050659e8e8c2d6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 
'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.945044', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df973276-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.570339558, 'message_signature': 'bb62f03df2b91267c2285ae9758c22e9b2f2b40fef64de8b3ca2c1d47fac84f4'}]}, 'timestamp': '2025-11-22 08:54:36.945615', '_unique_id': '3440e833e37d4115a9701f05cfa4b82e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.946 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.947 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.read.bytes volume: 27200512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.947 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.read.bytes volume: 209214 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a72d1b36-5afb-4df2-81f7-dc1d7c80150a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27200512, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.946989', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df977380-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '121f17ca07fa3675cb4f7adf8947523a97d0841b12f4f2a1946e244d8af48663'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 209214, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.946989', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df977d9e-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '47e22fbb69adedec030227e593c17ad3c54c2c9a820db15c0895a83e8122ead6'}]}, 'timestamp': '2025-11-22 08:54:36.947523', '_unique_id': '991e058693a440da8a1640eb08647abe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.948 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>]
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98750e09-976d-4ba6-8685-1f4ba4db759a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.949227', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df97ca2e-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '3ecd5054860df92bd272d72701766857ebbe7b1bcb2192679186aff7b06bd928'}]}, 'timestamp': '2025-11-22 08:54:36.949467', '_unique_id': '63bc6c86e9e9416bb018c95c765fa901'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.949 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.950 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff1b8345-49d1-4ac9-a7a6-abd51ca79da7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.950664', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df980214-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': 'c10bb70a9fb030d6884a239971d20efe132d8af026bf3b1820781cc9f9444a34'}]}, 'timestamp': '2025-11-22 08:54:36.950895', '_unique_id': '31458b85149b4ce58b7d6de69e23af5a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.951 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.952 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.952 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a08422d-0774-418b-a73a-b860a826d360', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.952323', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df98440e-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '75ea324cb18a1909566ad79d70ad72b6697b3a42ef8761be86c3bba2e347419a'}]}, 'timestamp': '2025-11-22 08:54:36.952643', '_unique_id': '22c599f1d26742ffbb3c406af2520434'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.953 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.954 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.954 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.write.requests volume: 234 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.954 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b452a953-60d1-48cd-9cd3-630ae2e21b44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 234, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.954075', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df988770-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '0698efd1ed5ae2a3db93804a5335182a4b3829799033f44d208b8998b45ddc3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.954075', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df989224-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '9e5605f00d8fd94496049f2183df9ee1fcc3337f1ae24332dc94a79c4b207a1e'}]}, 'timestamp': '2025-11-22 08:54:36.954627', '_unique_id': '9745e81d646f4f2dbdc5f3c4e351896c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.955 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9f260f1-f79d-42b1-aa94-fb87e97ae187', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9', 'timestamp': '2025-11-22T08:54:36.956008', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'df98d2f2-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.610588748, 'message_signature': '2216a2d6e2e2113513dfa685e7f069b26eff02607c2c56b38f905b751914b700'}]}, 'timestamp': '2025-11-22 08:54:36.956234', '_unique_id': '820b232da0cb4db9878d277028cf41cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.956 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.957 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.957 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>]
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.957 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbe9e43d-dc2f-4e56-b41b-212dde7760c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.957797', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df991960-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': 'db67f7e0093e2752a4bd9114d54fff7a5fdd4de16a518e19829f8266db65f952'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.957797', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df992414-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': 'c4fc43073fb9415d022c6f6d963121bba5138b9bdeee658122da39bfebe18fde'}]}, 'timestamp': '2025-11-22 08:54:36.958339', '_unique_id': 'd2db08c4125043c89ea7d659d385e774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.958 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.959 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f01225ba-45a8-4cba-8760-636ab6dd753e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.959937', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df996e7e-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '58be7c0d575c06156cd8a75215359e22a7bc976d23b17b4106c27f5cdbba903a'}]}, 'timestamp': '2025-11-22 08:54:36.960242', '_unique_id': 'd166fcb409d04b1cbb37e590459b383e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.960 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.961 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.961 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0a1e5ca-9e96-4821-a6e2-63fa3759f103', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000b8-90907795-67f7-464a-824a-fcd6047735a9-tap4ba6d1e2-e5', 'timestamp': '2025-11-22T08:54:36.961652', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'tap4ba6d1e2-e5', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:77:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ba6d1e2-e5'}, 'message_id': 'df99b08c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.564075394, 'message_signature': '6bdd3d2709ce632229d32c9a9b4cf4541080eaf07d6f124f41c767d692fdbe66'}]}, 'timestamp': '2025-11-22 08:54:36.961924', '_unique_id': 'b76cc3601db7401a96d3519bc9cb036d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.962 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.963 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.963 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'add5ac3c-b7c8-49ba-b350-12fe18de1875', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.963050', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df99e5a2-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.570339558, 'message_signature': 'e220c07168562fab7d911230df5916cf689769a03865e05eabdec02f75b7ba7a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.963050', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df99ed36-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.570339558, 'message_signature': '072ef03ad1e40bce27b837ab69c16a9b0ab8b053ce2003d1f389c1b6affd3b04'}]}, 'timestamp': '2025-11-22 08:54:36.963449', '_unique_id': 'b631e365e9d744c48652433713e4144d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.read.requests volume: 926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.964 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.read.requests volume: 89 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '308bba36-3b14-4c20-8345-cc77236bd805', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 926, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.964737', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df9a2792-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '324473a57ee081bf517f66e955be520dcb12bf7aa9a755b136ca7ab0992a09b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 89, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.964737', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df9a314c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '70892a9f44e374411aa0066f60fb3d1b444f25611d1decd5fe33c7b0d74af0b9'}]}, 'timestamp': '2025-11-22 08:54:36.965232', '_unique_id': '4e8fef2f7164475e93c9a4ac6f81af37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.965 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.966 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.write.latency volume: 74213887997 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.966 12 DEBUG ceilometer.compute.pollsters [-] 90907795-67f7-464a-824a-fcd6047735a9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c54aafc4-fb8e-41a5-b7f8-4e13c2c55121', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 74213887997, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-vda', 'timestamp': '2025-11-22T08:54:36.966496', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'df9a6d1a-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': 'c3d97480401864120d919ec303d925c96eb8248cee37225ee189705855c2ca2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': '90907795-67f7-464a-824a-fcd6047735a9-sda', 'timestamp': '2025-11-22T08:54:36.966496', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004', 'name': 'instance-000000b8', 'instance_id': '90907795-67f7-464a-824a-fcd6047735a9', 'instance_type': 'm1.nano', 'host': 'a10bd81c4c1c647c22bdb223dfc75d5f385d9b05ffd1c7ca61ac9d3f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'df9a760c-c780-11f0-941d-fa163e6775e5', 'monotonic_time': 8417.615532821, 'message_signature': '73736d2379622bfa13fc4a2da91023bacfd259fba3710c424fdcffdab70df66c'}]}, 'timestamp': '2025-11-22 08:54:36.966988', '_unique_id': '018b66a8572d43e99531c75d8308f4b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.967 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.968 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:54:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:54:36.968 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004>]
Nov 22 03:54:37 np0005531888 nova_compute[186788]: 2025-11-22 08:54:37.165 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:54:37 np0005531888 nova_compute[186788]: 2025-11-22 08:54:37.178 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:54:37 np0005531888 nova_compute[186788]: 2025-11-22 08:54:37.179 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:54:37 np0005531888 nova_compute[186788]: 2025-11-22 08:54:37.179 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:37Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:77:f2 10.100.0.11
Nov 22 03:54:37 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:37Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:77:f2 10.100.0.11
Nov 22 03:54:39 np0005531888 nova_compute[186788]: 2025-11-22 08:54:39.778 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:41 np0005531888 nova_compute[186788]: 2025-11-22 08:54:41.022 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:41 np0005531888 nova_compute[186788]: 2025-11-22 08:54:41.071 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:43 np0005531888 podman[253616]: 2025-11-22 08:54:43.687083547 +0000 UTC m=+0.054050491 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:54:43 np0005531888 podman[253617]: 2025-11-22 08:54:43.700762052 +0000 UTC m=+0.055785673 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:54:44 np0005531888 nova_compute[186788]: 2025-11-22 08:54:44.781 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:45 np0005531888 nova_compute[186788]: 2025-11-22 08:54:45.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:46 np0005531888 nova_compute[186788]: 2025-11-22 08:54:46.024 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:48 np0005531888 podman[253660]: 2025-11-22 08:54:48.686897589 +0000 UTC m=+0.062094168 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:54:48 np0005531888 podman[253661]: 2025-11-22 08:54:48.723360957 +0000 UTC m=+0.091884601 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:54:48 np0005531888 podman[253662]: 2025-11-22 08:54:48.730457962 +0000 UTC m=+0.095939902 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:54:48 np0005531888 nova_compute[186788]: 2025-11-22 08:54:48.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:49 np0005531888 nova_compute[186788]: 2025-11-22 08:54:49.782 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:49 np0005531888 nova_compute[186788]: 2025-11-22 08:54:49.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:49 np0005531888 nova_compute[186788]: 2025-11-22 08:54:49.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:54:50 np0005531888 nova_compute[186788]: 2025-11-22 08:54:50.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:51 np0005531888 nova_compute[186788]: 2025-11-22 08:54:51.026 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:52 np0005531888 nova_compute[186788]: 2025-11-22 08:54:52.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:54 np0005531888 nova_compute[186788]: 2025-11-22 08:54:54.785 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:55 np0005531888 nova_compute[186788]: 2025-11-22 08:54:55.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:55 np0005531888 nova_compute[186788]: 2025-11-22 08:54:55.985 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:55 np0005531888 nova_compute[186788]: 2025-11-22 08:54:55.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:55 np0005531888 nova_compute[186788]: 2025-11-22 08:54:55.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:55 np0005531888 nova_compute[186788]: 2025-11-22 08:54:55.986 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.028 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.068 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.150 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.151 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.216 186792 DEBUG oslo_concurrency.processutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.403 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.405 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5530MB free_disk=73.23012161254883GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.405 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.405 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:56 np0005531888 ovn_controller[95067]: 2025-11-22T08:54:56Z|00775|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.510 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance 90907795-67f7-464a-824a-fcd6047735a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.510 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.510 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.569 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.584 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.609 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:54:56 np0005531888 nova_compute[186788]: 2025-11-22 08:54:56.610 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:59 np0005531888 nova_compute[186788]: 2025-11-22 08:54:59.788 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:00 np0005531888 nova_compute[186788]: 2025-11-22 08:55:00.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:00 np0005531888 nova_compute[186788]: 2025-11-22 08:55:00.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:55:01 np0005531888 nova_compute[186788]: 2025-11-22 08:55:01.030 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:03 np0005531888 podman[253734]: 2025-11-22 08:55:03.676870287 +0000 UTC m=+0.052768179 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Nov 22 03:55:03 np0005531888 podman[253733]: 2025-11-22 08:55:03.69855197 +0000 UTC m=+0.077625200 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:55:04 np0005531888 nova_compute[186788]: 2025-11-22 08:55:04.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:06 np0005531888 nova_compute[186788]: 2025-11-22 08:55:06.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:06 np0005531888 nova_compute[186788]: 2025-11-22 08:55:06.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:08.802 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:55:08 np0005531888 nova_compute[186788]: 2025-11-22 08:55:08.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:08 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:08.804 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:55:09 np0005531888 nova_compute[186788]: 2025-11-22 08:55:09.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:10 np0005531888 nova_compute[186788]: 2025-11-22 08:55:10.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:10 np0005531888 nova_compute[186788]: 2025-11-22 08:55:10.965 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:55:10 np0005531888 nova_compute[186788]: 2025-11-22 08:55:10.983 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:55:11 np0005531888 nova_compute[186788]: 2025-11-22 08:55:11.034 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:12 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:12.806 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:55:14 np0005531888 podman[253774]: 2025-11-22 08:55:14.710343271 +0000 UTC m=+0.084698045 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:55:14 np0005531888 podman[253775]: 2025-11-22 08:55:14.711469289 +0000 UTC m=+0.080001260 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:55:14 np0005531888 nova_compute[186788]: 2025-11-22 08:55:14.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:16 np0005531888 nova_compute[186788]: 2025-11-22 08:55:16.036 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:19 np0005531888 podman[253816]: 2025-11-22 08:55:19.690545792 +0000 UTC m=+0.064143260 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Nov 22 03:55:19 np0005531888 podman[253817]: 2025-11-22 08:55:19.714772067 +0000 UTC m=+0.081753901 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:55:19 np0005531888 podman[253818]: 2025-11-22 08:55:19.742841538 +0000 UTC m=+0.107962637 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:55:19 np0005531888 nova_compute[186788]: 2025-11-22 08:55:19.794 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:19 np0005531888 nova_compute[186788]: 2025-11-22 08:55:19.969 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:21 np0005531888 nova_compute[186788]: 2025-11-22 08:55:21.039 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:24 np0005531888 nova_compute[186788]: 2025-11-22 08:55:24.798 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:26 np0005531888 nova_compute[186788]: 2025-11-22 08:55:26.041 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:29 np0005531888 nova_compute[186788]: 2025-11-22 08:55:29.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:31 np0005531888 nova_compute[186788]: 2025-11-22 08:55:31.045 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:34 np0005531888 podman[253885]: 2025-11-22 08:55:34.680316666 +0000 UTC m=+0.051001326 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:55:34 np0005531888 podman[253884]: 2025-11-22 08:55:34.704622354 +0000 UTC m=+0.077271342 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:55:34 np0005531888 nova_compute[186788]: 2025-11-22 08:55:34.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:35 np0005531888 nova_compute[186788]: 2025-11-22 08:55:35.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:35 np0005531888 nova_compute[186788]: 2025-11-22 08:55:35.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:55:35 np0005531888 nova_compute[186788]: 2025-11-22 08:55:35.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:55:36 np0005531888 nova_compute[186788]: 2025-11-22 08:55:36.047 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:36 np0005531888 nova_compute[186788]: 2025-11-22 08:55:36.203 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:55:36 np0005531888 nova_compute[186788]: 2025-11-22 08:55:36.203 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquired lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:55:36 np0005531888 nova_compute[186788]: 2025-11-22 08:55:36.203 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:55:36 np0005531888 nova_compute[186788]: 2025-11-22 08:55:36.203 186792 DEBUG nova.objects.instance [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 90907795-67f7-464a-824a-fcd6047735a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:55:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:36.877 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:36.878 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:36.878 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:39 np0005531888 nova_compute[186788]: 2025-11-22 08:55:39.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:39 np0005531888 nova_compute[186788]: 2025-11-22 08:55:39.836 186792 DEBUG nova.network.neutron [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:55:39 np0005531888 nova_compute[186788]: 2025-11-22 08:55:39.858 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Releasing lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:55:39 np0005531888 nova_compute[186788]: 2025-11-22 08:55:39.859 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:55:39 np0005531888 nova_compute[186788]: 2025-11-22 08:55:39.859 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:40 np0005531888 nova_compute[186788]: 2025-11-22 08:55:40.901 186792 DEBUG nova.compute.manager [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-changed-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:55:40 np0005531888 nova_compute[186788]: 2025-11-22 08:55:40.901 186792 DEBUG nova.compute.manager [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Refreshing instance network info cache due to event network-changed-4ba6d1e2-e54f-4374-b3a4-34f95f06745a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:55:40 np0005531888 nova_compute[186788]: 2025-11-22 08:55:40.901 186792 DEBUG oslo_concurrency.lockutils [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:55:40 np0005531888 nova_compute[186788]: 2025-11-22 08:55:40.901 186792 DEBUG oslo_concurrency.lockutils [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:55:40 np0005531888 nova_compute[186788]: 2025-11-22 08:55:40.902 186792 DEBUG nova.network.neutron [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Refreshing network info cache for port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.049 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.263 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.264 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.264 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.264 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.265 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.276 186792 INFO nova.compute.manager [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Terminating instance#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.284 186792 DEBUG nova.compute.manager [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:55:41 np0005531888 kernel: tap4ba6d1e2-e5 (unregistering): left promiscuous mode
Nov 22 03:55:41 np0005531888 NetworkManager[55166]: <info>  [1763801741.3062] device (tap4ba6d1e2-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.315 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:55:41Z|00776|binding|INFO|Releasing lport 4ba6d1e2-e54f-4374-b3a4-34f95f06745a from this chassis (sb_readonly=0)
Nov 22 03:55:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:55:41Z|00777|binding|INFO|Setting lport 4ba6d1e2-e54f-4374-b3a4-34f95f06745a down in Southbound
Nov 22 03:55:41 np0005531888 ovn_controller[95067]: 2025-11-22T08:55:41Z|00778|binding|INFO|Removing iface tap4ba6d1e2-e5 ovn-installed in OVS
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.332 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Nov 22 03:55:41 np0005531888 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b8.scope: Consumed 17.575s CPU time.
Nov 22 03:55:41 np0005531888 systemd-machined[153106]: Machine qemu-88-instance-000000b8 terminated.
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.515 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:77:f2 10.100.0.11'], port_security=['fa:16:3e:24:77:f2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '90907795-67f7-464a-824a-fcd6047735a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '4', 'neutron:security_group_ids': '984f816a-7307-432f-9913-aaee1df5bf08 b3726af1-bbad-4493-94ab-7644017fbf88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=809b4a06-a3bb-45c6-b4f6-e663731ee64f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=4ba6d1e2-e54f-4374-b3a4-34f95f06745a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.517 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a in datapath 4d568a4f-3fc1-4760-b924-569e98e1b4a7 unbound from our chassis#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.519 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d568a4f-3fc1-4760-b924-569e98e1b4a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.521 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[803e9818-8085-4f45-a5ea-b8b2618c59f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.521 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 namespace which is not needed anymore#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.557 186792 INFO nova.virt.libvirt.driver [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Instance destroyed successfully.#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.557 186792 DEBUG nova.objects.instance [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid 90907795-67f7-464a-824a-fcd6047735a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.571 186792 DEBUG nova.virt.libvirt.vif [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:54:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-1839192004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=184,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAw4mkfFiRvcPYUE0PIsjDRQ2l9YGHYiiINwqgLaZjT4Jz+8V9nq9XzUIN6IBe3EfaIEfpC2/icXPG/z2BuoLG3JQ3o5sNQAUv0uO8d73RniLqWnU/BDSHGzBNmpYdI7sw==',key_name='tempest-TestSecurityGroupsBasicOps-2127958203',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:54:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-u3849c8o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:54:21Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=90907795-67f7-464a-824a-fcd6047735a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.572 186792 DEBUG nova.network.os_vif_util [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.573 186792 DEBUG nova.network.os_vif_util [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.573 186792 DEBUG os_vif [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.575 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.575 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba6d1e2-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.577 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.580 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.582 186792 INFO os_vif [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:77:f2,bridge_name='br-int',has_traffic_filtering=True,id=4ba6d1e2-e54f-4374-b3a4-34f95f06745a,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba6d1e2-e5')#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.583 186792 INFO nova.virt.libvirt.driver [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Deleting instance files /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9_del#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.584 186792 INFO nova.virt.libvirt.driver [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Deletion of /var/lib/nova/instances/90907795-67f7-464a-824a-fcd6047735a9_del complete#033[00m
Nov 22 03:55:41 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [NOTICE]   (253540) : haproxy version is 2.8.14-c23fe91
Nov 22 03:55:41 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [NOTICE]   (253540) : path to executable is /usr/sbin/haproxy
Nov 22 03:55:41 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [WARNING]  (253540) : Exiting Master process...
Nov 22 03:55:41 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [ALERT]    (253540) : Current worker (253542) exited with code 143 (Terminated)
Nov 22 03:55:41 np0005531888 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[253536]: [WARNING]  (253540) : All workers exited. Exiting... (0)
Nov 22 03:55:41 np0005531888 systemd[1]: libpod-4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3.scope: Deactivated successfully.
Nov 22 03:55:41 np0005531888 podman[253965]: 2025-11-22 08:55:41.665147571 +0000 UTC m=+0.048450112 container died 4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:55:41 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3-userdata-shm.mount: Deactivated successfully.
Nov 22 03:55:41 np0005531888 systemd[1]: var-lib-containers-storage-overlay-5c0ffd2156db8f50409e720985cf860d316f1fc39b98096a13619a5a09fe2f15-merged.mount: Deactivated successfully.
Nov 22 03:55:41 np0005531888 podman[253965]: 2025-11-22 08:55:41.723270422 +0000 UTC m=+0.106572963 container cleanup 4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:55:41 np0005531888 systemd[1]: libpod-conmon-4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3.scope: Deactivated successfully.
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.802 186792 INFO nova.compute.manager [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.803 186792 DEBUG oslo.service.loopingcall [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.803 186792 DEBUG nova.compute.manager [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.803 186792 DEBUG nova.network.neutron [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:55:41 np0005531888 podman[253993]: 2025-11-22 08:55:41.808365665 +0000 UTC m=+0.064336433 container remove 4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.815 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2d974d23-65c6-48ac-8f3e-d4ab7ecc08f3]: (4, ('Sat Nov 22 08:55:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 (4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3)\n4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3\nSat Nov 22 08:55:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 (4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3)\n4604f7721c8f5268958701fd409d638d0ab4671b891617d193b16735d12ef9c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.817 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[3cdcca8c-1c69-4380-b4a5-aef3ca0e58f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.818 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d568a4f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.820 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 kernel: tap4d568a4f-30: left promiscuous mode
Nov 22 03:55:41 np0005531888 nova_compute[186788]: 2025-11-22 08:55:41.833 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.837 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[95cfb46a-df4b-478b-9ac2-d9e3d636b4f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.853 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[a2998fa7-f0f0-407c-a356-085ab46d43b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.855 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[08158ac2-10ed-499f-8f4c-1e10e51b4515]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.870 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f759aec5-b394-4f18-ae3a-94794ec4a2ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840076, 'reachable_time': 34886, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254008, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.873 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:55:41 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:55:41.873 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f870c5-dc17-4761-bba9-98cab879a792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:55:41 np0005531888 systemd[1]: run-netns-ovnmeta\x2d4d568a4f\x2d3fc1\x2d4760\x2db924\x2d569e98e1b4a7.mount: Deactivated successfully.
Nov 22 03:55:42 np0005531888 nova_compute[186788]: 2025-11-22 08:55:42.641 186792 DEBUG nova.network.neutron [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updated VIF entry in instance network info cache for port 4ba6d1e2-e54f-4374-b3a4-34f95f06745a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:55:42 np0005531888 nova_compute[186788]: 2025-11-22 08:55:42.641 186792 DEBUG nova.network.neutron [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [{"id": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "address": "fa:16:3e:24:77:f2", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba6d1e2-e5", "ovs_interfaceid": "4ba6d1e2-e54f-4374-b3a4-34f95f06745a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:55:42 np0005531888 nova_compute[186788]: 2025-11-22 08:55:42.853 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:42 np0005531888 nova_compute[186788]: 2025-11-22 08:55:42.938 186792 DEBUG oslo_concurrency.lockutils [req-24101bf1-fd79-41d0-b7ef-65e2554c1495 req-e8131da2-76a1-4963-9aed-9abaf59f9100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-90907795-67f7-464a-824a-fcd6047735a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.002 186792 DEBUG nova.compute.manager [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-vif-unplugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.002 186792 DEBUG oslo_concurrency.lockutils [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.002 186792 DEBUG oslo_concurrency.lockutils [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.002 186792 DEBUG oslo_concurrency.lockutils [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.002 186792 DEBUG nova.compute.manager [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] No waiting events found dispatching network-vif-unplugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 DEBUG nova.compute.manager [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-vif-unplugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 DEBUG nova.compute.manager [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 DEBUG oslo_concurrency.lockutils [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "90907795-67f7-464a-824a-fcd6047735a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 DEBUG oslo_concurrency.lockutils [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 DEBUG oslo_concurrency.lockutils [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 DEBUG nova.compute.manager [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] No waiting events found dispatching network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.003 186792 WARNING nova.compute.manager [req-f927c120-f75b-4ade-a242-409e4a636564 req-db342abd-8b9d-42f8-8c3d-29720e880356 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received unexpected event network-vif-plugged-4ba6d1e2-e54f-4374-b3a4-34f95f06745a for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.345 186792 DEBUG nova.network.neutron [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.425 186792 INFO nova.compute.manager [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Took 1.62 seconds to deallocate network for instance.#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.594 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.595 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.650 186792 DEBUG nova.compute.provider_tree [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.664 186792 DEBUG nova.scheduler.client.report [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.729 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:43 np0005531888 nova_compute[186788]: 2025-11-22 08:55:43.938 186792 INFO nova.scheduler.client.report [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance 90907795-67f7-464a-824a-fcd6047735a9#033[00m
Nov 22 03:55:44 np0005531888 nova_compute[186788]: 2025-11-22 08:55:44.123 186792 DEBUG oslo_concurrency.lockutils [None req-bce86b05-8f50-491f-a61c-cd67e8daef5d 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "90907795-67f7-464a-824a-fcd6047735a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:44 np0005531888 nova_compute[186788]: 2025-11-22 08:55:44.806 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:45 np0005531888 nova_compute[186788]: 2025-11-22 08:55:45.086 186792 DEBUG nova.compute.manager [req-6d62efad-9686-416f-9832-0472ab437b7e req-442de6c2-a76a-4d25-82db-ef0103d769c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Received event network-vif-deleted-4ba6d1e2-e54f-4374-b3a4-34f95f06745a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:55:45 np0005531888 podman[254010]: 2025-11-22 08:55:45.680404894 +0000 UTC m=+0.052411870 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:55:45 np0005531888 podman[254009]: 2025-11-22 08:55:45.682418073 +0000 UTC m=+0.055373163 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 03:55:46 np0005531888 nova_compute[186788]: 2025-11-22 08:55:46.578 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:46 np0005531888 nova_compute[186788]: 2025-11-22 08:55:46.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:48 np0005531888 nova_compute[186788]: 2025-11-22 08:55:48.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:49 np0005531888 nova_compute[186788]: 2025-11-22 08:55:49.808 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:50 np0005531888 podman[254048]: 2025-11-22 08:55:50.688608513 +0000 UTC m=+0.059482634 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:55:50 np0005531888 podman[254049]: 2025-11-22 08:55:50.700333381 +0000 UTC m=+0.065336567 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:55:50 np0005531888 podman[254050]: 2025-11-22 08:55:50.729442368 +0000 UTC m=+0.090144739 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:55:50 np0005531888 nova_compute[186788]: 2025-11-22 08:55:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:51 np0005531888 nova_compute[186788]: 2025-11-22 08:55:51.580 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:51 np0005531888 nova_compute[186788]: 2025-11-22 08:55:51.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:51 np0005531888 nova_compute[186788]: 2025-11-22 08:55:51.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:55:52 np0005531888 nova_compute[186788]: 2025-11-22 08:55:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:53 np0005531888 nova_compute[186788]: 2025-11-22 08:55:53.057 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:53 np0005531888 nova_compute[186788]: 2025-11-22 08:55:53.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:54 np0005531888 nova_compute[186788]: 2025-11-22 08:55:54.811 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.554 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801741.5517294, 90907795-67f7-464a-824a-fcd6047735a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.555 186792 INFO nova.compute.manager [-] [instance: 90907795-67f7-464a-824a-fcd6047735a9] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.581 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.592 186792 DEBUG nova.compute.manager [None req-e37b0fbf-8f27-4bbd-b4b9-73f449dd1e93 - - - - - -] [instance: 90907795-67f7-464a-824a-fcd6047735a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.985 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.985 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:56 np0005531888 nova_compute[186788]: 2025-11-22 08:55:56.985 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.208 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.210 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.25945663452148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.211 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.211 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.302 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.302 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.316 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.345 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.346 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.376 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.405 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.438 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.453 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.475 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:55:57 np0005531888 nova_compute[186788]: 2025-11-22 08:55:57.475 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:59 np0005531888 nova_compute[186788]: 2025-11-22 08:55:59.813 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:01 np0005531888 nova_compute[186788]: 2025-11-22 08:56:01.584 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:04 np0005531888 nova_compute[186788]: 2025-11-22 08:56:04.814 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:05 np0005531888 podman[254114]: 2025-11-22 08:56:05.686024934 +0000 UTC m=+0.053374814 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 03:56:05 np0005531888 podman[254113]: 2025-11-22 08:56:05.687985303 +0000 UTC m=+0.058365588 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:56:06 np0005531888 nova_compute[186788]: 2025-11-22 08:56:06.587 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:09 np0005531888 nova_compute[186788]: 2025-11-22 08:56:09.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:11 np0005531888 nova_compute[186788]: 2025-11-22 08:56:11.589 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:14 np0005531888 nova_compute[186788]: 2025-11-22 08:56:14.817 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:56:15.075 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:56:15 np0005531888 nova_compute[186788]: 2025-11-22 08:56:15.076 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:15 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:56:15.077 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:56:16 np0005531888 nova_compute[186788]: 2025-11-22 08:56:16.590 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:16 np0005531888 podman[254154]: 2025-11-22 08:56:16.676415205 +0000 UTC m=+0.048695500 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:56:16 np0005531888 podman[254153]: 2025-11-22 08:56:16.688695477 +0000 UTC m=+0.063882183 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 22 03:56:19 np0005531888 nova_compute[186788]: 2025-11-22 08:56:19.818 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:21 np0005531888 nova_compute[186788]: 2025-11-22 08:56:21.593 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:21 np0005531888 podman[254195]: 2025-11-22 08:56:21.67749519 +0000 UTC m=+0.054805831 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, version=9.6, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc.)
Nov 22 03:56:21 np0005531888 podman[254196]: 2025-11-22 08:56:21.68725693 +0000 UTC m=+0.055040826 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:56:21 np0005531888 podman[254197]: 2025-11-22 08:56:21.722140968 +0000 UTC m=+0.088829607 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 22 03:56:22 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:56:22.080 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:56:24 np0005531888 nova_compute[186788]: 2025-11-22 08:56:24.820 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:26 np0005531888 nova_compute[186788]: 2025-11-22 08:56:26.401 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:26 np0005531888 nova_compute[186788]: 2025-11-22 08:56:26.595 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:29 np0005531888 nova_compute[186788]: 2025-11-22 08:56:29.821 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:31 np0005531888 nova_compute[186788]: 2025-11-22 08:56:31.597 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:34 np0005531888 nova_compute[186788]: 2025-11-22 08:56:34.823 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:35 np0005531888 nova_compute[186788]: 2025-11-22 08:56:35.978 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:36 np0005531888 ovn_controller[95067]: 2025-11-22T08:56:36Z|00779|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 22 03:56:36 np0005531888 nova_compute[186788]: 2025-11-22 08:56:36.599 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:36 np0005531888 podman[254259]: 2025-11-22 08:56:36.681742645 +0000 UTC m=+0.058334236 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:56:36 np0005531888 podman[254260]: 2025-11-22 08:56:36.682225097 +0000 UTC m=+0.053044266 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.860 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:56:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:56:36.878 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:56:36.879 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:56:36.879 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:36 np0005531888 nova_compute[186788]: 2025-11-22 08:56:36.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:36 np0005531888 nova_compute[186788]: 2025-11-22 08:56:36.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:56:36 np0005531888 nova_compute[186788]: 2025-11-22 08:56:36.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:56:36 np0005531888 nova_compute[186788]: 2025-11-22 08:56:36.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:56:39 np0005531888 nova_compute[186788]: 2025-11-22 08:56:39.825 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:40 np0005531888 nova_compute[186788]: 2025-11-22 08:56:40.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:41 np0005531888 nova_compute[186788]: 2025-11-22 08:56:41.601 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:44 np0005531888 nova_compute[186788]: 2025-11-22 08:56:44.827 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:46 np0005531888 nova_compute[186788]: 2025-11-22 08:56:46.603 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:47 np0005531888 podman[254304]: 2025-11-22 08:56:47.682805218 +0000 UTC m=+0.049331164 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:56:47 np0005531888 podman[254303]: 2025-11-22 08:56:47.694979708 +0000 UTC m=+0.064072437 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:56:47 np0005531888 nova_compute[186788]: 2025-11-22 08:56:47.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:49 np0005531888 nova_compute[186788]: 2025-11-22 08:56:49.829 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:49 np0005531888 nova_compute[186788]: 2025-11-22 08:56:49.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:50 np0005531888 nova_compute[186788]: 2025-11-22 08:56:50.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:51 np0005531888 nova_compute[186788]: 2025-11-22 08:56:51.606 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:52 np0005531888 podman[254346]: 2025-11-22 08:56:52.686378015 +0000 UTC m=+0.060338056 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 22 03:56:52 np0005531888 podman[254345]: 2025-11-22 08:56:52.686473127 +0000 UTC m=+0.060847948 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, version=9.6, architecture=x86_64, config_id=edpm, vcs-type=git, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Nov 22 03:56:52 np0005531888 podman[254347]: 2025-11-22 08:56:52.72276891 +0000 UTC m=+0.086937300 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 03:56:52 np0005531888 nova_compute[186788]: 2025-11-22 08:56:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:52 np0005531888 nova_compute[186788]: 2025-11-22 08:56:52.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.254 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.254 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.278 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.486 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.487 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.493 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.494 186792 INFO nova.compute.claims [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.664 186792 DEBUG nova.compute.provider_tree [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.680 186792 DEBUG nova.scheduler.client.report [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.704 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.706 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.759 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.760 186792 DEBUG nova.network.neutron [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.777 186792 INFO nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.804 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.830 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.923 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.925 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.926 186792 INFO nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Creating image(s)#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.927 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.928 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.929 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.957 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.959 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:56:54 np0005531888 nova_compute[186788]: 2025-11-22 08:56:54.991 186792 DEBUG nova.policy [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.024 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.025 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.025 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.041 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.098 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.100 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.293 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk 1073741824" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.294 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.294 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.354 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.355 186792 DEBUG nova.virt.disk.api [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.355 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.413 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.414 186792 DEBUG nova.virt.disk.api [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.414 186792 DEBUG nova.objects.instance [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid eef702de-73fd-4f21-bb6c-40b922ce92b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.454 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.454 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Ensure instance console log exists: /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.455 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.455 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.456 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:55 np0005531888 nova_compute[186788]: 2025-11-22 08:56:55.819 186792 DEBUG nova.network.neutron [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Successfully created port: 71c0be23-6331-45a3-8a79-5b58adf7f6bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:56:56 np0005531888 nova_compute[186788]: 2025-11-22 08:56:56.608 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:56 np0005531888 nova_compute[186788]: 2025-11-22 08:56:56.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:56 np0005531888 nova_compute[186788]: 2025-11-22 08:56:56.985 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:56 np0005531888 nova_compute[186788]: 2025-11-22 08:56:56.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:56 np0005531888 nova_compute[186788]: 2025-11-22 08:56:56.986 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:56 np0005531888 nova_compute[186788]: 2025-11-22 08:56:56.986 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.140 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.142 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5694MB free_disk=73.25924682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.142 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.241 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Instance eef702de-73fd-4f21-bb6c-40b922ce92b7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.242 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.242 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.297 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.310 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.337 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:56:57 np0005531888 nova_compute[186788]: 2025-11-22 08:56:57.337 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.242 186792 DEBUG nova.network.neutron [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Successfully updated port: 71c0be23-6331-45a3-8a79-5b58adf7f6bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.262 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.263 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.263 186792 DEBUG nova.network.neutron [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.456 186792 DEBUG nova.compute.manager [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-changed-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.456 186792 DEBUG nova.compute.manager [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Refreshing instance network info cache due to event network-changed-71c0be23-6331-45a3-8a79-5b58adf7f6bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.456 186792 DEBUG oslo_concurrency.lockutils [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:56:58 np0005531888 nova_compute[186788]: 2025-11-22 08:56:58.514 186792 DEBUG nova.network.neutron [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:56:59 np0005531888 nova_compute[186788]: 2025-11-22 08:56:59.832 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.030 186792 DEBUG nova.network.neutron [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Updating instance_info_cache with network_info: [{"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.051 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.052 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Instance network_info: |[{"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.052 186792 DEBUG oslo_concurrency.lockutils [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.052 186792 DEBUG nova.network.neutron [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Refreshing network info cache for port 71c0be23-6331-45a3-8a79-5b58adf7f6bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.055 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Start _get_guest_xml network_info=[{"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'encryption_format': None, 'device_type': 'disk', 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.059 186792 WARNING nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.065 186792 DEBUG nova.virt.libvirt.host [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.065 186792 DEBUG nova.virt.libvirt.host [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.070 186792 DEBUG nova.virt.libvirt.host [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.070 186792 DEBUG nova.virt.libvirt.host [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.071 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.072 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.072 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.072 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.073 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.073 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.073 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.073 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.074 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.074 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.074 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.074 186792 DEBUG nova.virt.hardware [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.078 186792 DEBUG nova.virt.libvirt.vif [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:56:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=187,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPHag9aku7GBAkvmU1fN0BEWX+GKKnlz+wxkaBB81usyGYFIYXdms2nPbWtkH6Pt7jECf5QAIbXgbB8vKLjCswltA0JVkNEQJtQT24F3RbwiywHh6gfKMrOlaLUdm6xTrw==',key_name='tempest-TestSecurityGroupsBasicOps-1099778393',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-4h80siie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:56:54Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=eef702de-73fd-4f21-bb6c-40b922ce92b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.078 186792 DEBUG nova.network.os_vif_util [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.079 186792 DEBUG nova.network.os_vif_util [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.080 186792 DEBUG nova.objects.instance [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid eef702de-73fd-4f21-bb6c-40b922ce92b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.094 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <uuid>eef702de-73fd-4f21-bb6c-40b922ce92b7</uuid>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <name>instance-000000bb</name>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <memory>131072</memory>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <vcpu>1</vcpu>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <metadata>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495</nova:name>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:creationTime>2025-11-22 08:57:00</nova:creationTime>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:flavor name="m1.nano">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:memory>128</nova:memory>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:disk>1</nova:disk>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:swap>0</nova:swap>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      </nova:flavor>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:owner>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      </nova:owner>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <nova:ports>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        <nova:port uuid="71c0be23-6331-45a3-8a79-5b58adf7f6bd">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:        </nova:port>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      </nova:ports>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </nova:instance>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </metadata>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <sysinfo type="smbios">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <system>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <entry name="serial">eef702de-73fd-4f21-bb6c-40b922ce92b7</entry>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <entry name="uuid">eef702de-73fd-4f21-bb6c-40b922ce92b7</entry>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </system>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </sysinfo>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <os>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <boot dev="hd"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <smbios mode="sysinfo"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </os>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <features>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <acpi/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <apic/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <vmcoreinfo/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </features>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <clock offset="utc">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <timer name="hpet" present="no"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </clock>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <cpu mode="custom" match="exact">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <model>Nehalem</model>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </cpu>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  <devices>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <disk type="file" device="disk">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <target dev="vda" bus="virtio"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <disk type="file" device="cdrom">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <source file="/var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.config"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <target dev="sda" bus="sata"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </disk>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <interface type="ethernet">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <mac address="fa:16:3e:b0:1e:c7"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <mtu size="1442"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <target dev="tap71c0be23-63"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </interface>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <serial type="pty">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <log file="/var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/console.log" append="off"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </serial>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <video>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <model type="virtio"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </video>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <input type="tablet" bus="usb"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <rng model="virtio">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </rng>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <controller type="usb" index="0"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    <memballoon model="virtio">
Nov 22 03:57:00 np0005531888 nova_compute[186788]:      <stats period="10"/>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:    </memballoon>
Nov 22 03:57:00 np0005531888 nova_compute[186788]:  </devices>
Nov 22 03:57:00 np0005531888 nova_compute[186788]: </domain>
Nov 22 03:57:00 np0005531888 nova_compute[186788]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.095 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Preparing to wait for external event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.096 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.096 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.096 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.097 186792 DEBUG nova.virt.libvirt.vif [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:56:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=187,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPHag9aku7GBAkvmU1fN0BEWX+GKKnlz+wxkaBB81usyGYFIYXdms2nPbWtkH6Pt7jECf5QAIbXgbB8vKLjCswltA0JVkNEQJtQT24F3RbwiywHh6gfKMrOlaLUdm6xTrw==',key_name='tempest-TestSecurityGroupsBasicOps-1099778393',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-4h80siie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:56:54Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=eef702de-73fd-4f21-bb6c-40b922ce92b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.097 186792 DEBUG nova.network.os_vif_util [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.098 186792 DEBUG nova.network.os_vif_util [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.098 186792 DEBUG os_vif [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.099 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.099 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.099 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.102 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.102 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71c0be23-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.103 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71c0be23-63, col_values=(('external_ids', {'iface-id': '71c0be23-6331-45a3-8a79-5b58adf7f6bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:1e:c7', 'vm-uuid': 'eef702de-73fd-4f21-bb6c-40b922ce92b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:00 np0005531888 NetworkManager[55166]: <info>  [1763801820.1064] manager: (tap71c0be23-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.113 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.114 186792 INFO os_vif [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63')#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.280 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.280 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.280 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:b0:1e:c7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:57:00 np0005531888 nova_compute[186788]: 2025-11-22 08:57:00.281 186792 INFO nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Using config drive#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.008 186792 INFO nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Creating config drive at /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.config#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.013 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo94e6wew execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.140 186792 DEBUG oslo_concurrency.processutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo94e6wew" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:57:01 np0005531888 kernel: tap71c0be23-63: entered promiscuous mode
Nov 22 03:57:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:01Z|00780|binding|INFO|Claiming lport 71c0be23-6331-45a3-8a79-5b58adf7f6bd for this chassis.
Nov 22 03:57:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:01Z|00781|binding|INFO|71c0be23-6331-45a3-8a79-5b58adf7f6bd: Claiming fa:16:3e:b0:1e:c7 10.100.0.11
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.2033] manager: (tap71c0be23-63): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.204 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.211 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.2137] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.213 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.2146] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.225 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:1e:c7 10.100.0.11'], port_security=['fa:16:3e:b0:1e:c7 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'eef702de-73fd-4f21-bb6c-40b922ce92b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9b6088d-9774-4171-9ae0-83f685fc1451', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71640610-572c-4a8b-b43c-9737c485843b, chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=71c0be23-6331-45a3-8a79-5b58adf7f6bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.227 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 71c0be23-6331-45a3-8a79-5b58adf7f6bd in datapath 13f06a8d-f6ae-46ea-b973-f89bfd41893c bound to our chassis#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.228 104023 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 13f06a8d-f6ae-46ea-b973-f89bfd41893c#033[00m
Nov 22 03:57:01 np0005531888 systemd-udevd[254445]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.239 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[41aff2c7-99b8-4f2a-a283-74529e746a5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.239 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap13f06a8d-f1 in ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.241 213587 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap13f06a8d-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.242 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7b78a1-ab7d-4588-ac62-d918f4b14338]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.242 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[05799dd2-8986-4df9-a091-91f97b7c97f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 systemd-machined[153106]: New machine qemu-89-instance-000000bb.
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.2450] device (tap71c0be23-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.2460] device (tap71c0be23-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.257 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f7baf3-3e91-42f7-8802-8851451fee49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 systemd[1]: Started Virtual Machine qemu-89-instance-000000bb.
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.279 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ffaceaa1-1859-45d0-8c81-c1a5f91ba21e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.284 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.292 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:01Z|00782|binding|INFO|Setting lport 71c0be23-6331-45a3-8a79-5b58adf7f6bd ovn-installed in OVS
Nov 22 03:57:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:01Z|00783|binding|INFO|Setting lport 71c0be23-6331-45a3-8a79-5b58adf7f6bd up in Southbound
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.300 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.312 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[793d3db4-1764-4b2e-9632-a4f01097c893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.317 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[35145312-d6d6-41b6-96c8-61a72738c513]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.3183] manager: (tap13f06a8d-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/376)
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.354 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[63eefaea-e52b-4547-8d7a-61871b17e21a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.356 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d44e81-a9c2-4d9e-b8e6-11af83293d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.3809] device (tap13f06a8d-f0): carrier: link connected
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.386 213625 DEBUG oslo.privsep.daemon [-] privsep: reply[333bc6f9-3890-4930-9599-f4059e93f9b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.405 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[79152310-e84e-4740-8aab-e10430f68bd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f06a8d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:28:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856202, 'reachable_time': 31497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254479, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.418 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[09df2d00-359e-457e-ad51-5f40578d34ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:2818'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 856202, 'tstamp': 856202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254480, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.438 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[406ff54f-075c-4aa9-9f9c-d6138b4f8ec8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f06a8d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:28:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 231], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856202, 'reachable_time': 31497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254481, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.466 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[39588593-9c2b-420e-981b-ab6b125d229c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.515 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4b6d45-6f6b-4c26-9d39-fa69db1c2e7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.516 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f06a8d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.516 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.516 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13f06a8d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.518 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 kernel: tap13f06a8d-f0: entered promiscuous mode
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.520 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap13f06a8d-f0, col_values=(('external_ids', {'iface-id': '1131c2e2-47c4-45b6-8f6d-5a584a957856'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.521 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 NetworkManager[55166]: <info>  [1763801821.5221] manager: (tap13f06a8d-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Nov 22 03:57:01 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:01Z|00784|binding|INFO|Releasing lport 1131c2e2-47c4-45b6-8f6d-5a584a957856 from this chassis (sb_readonly=0)
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.535 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.536 104023 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/13f06a8d-f6ae-46ea-b973-f89bfd41893c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/13f06a8d-f6ae-46ea-b973-f89bfd41893c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.537 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[88a60d02-3c3c-4012-bf1e-3f75ec12314d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.539 104023 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: global
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    log         /dev/log local0 debug
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    log-tag     haproxy-metadata-proxy-13f06a8d-f6ae-46ea-b973-f89bfd41893c
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    user        root
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    group       root
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    maxconn     1024
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    pidfile     /var/lib/neutron/external/pids/13f06a8d-f6ae-46ea-b973-f89bfd41893c.pid.haproxy
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    daemon
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: defaults
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    log global
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    mode http
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    option httplog
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    option dontlognull
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    option http-server-close
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    option forwardfor
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    retries                 3
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    timeout http-request    30s
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    timeout connect         30s
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    timeout client          32s
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    timeout server          32s
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    timeout http-keep-alive 30s
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: listen listener
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    bind 169.254.169.254:80
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]:    http-request add-header X-OVN-Network-ID 13f06a8d-f6ae-46ea-b973-f89bfd41893c
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:57:01 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:01.540 104023 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'env', 'PROCESS_TAG=haproxy-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/13f06a8d-f6ae-46ea-b973-f89bfd41893c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.601 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801821.6003742, eef702de-73fd-4f21-bb6c-40b922ce92b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.601 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] VM Started (Lifecycle Event)#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.628 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.633 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801821.6005228, eef702de-73fd-4f21-bb6c-40b922ce92b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.633 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.650 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.654 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:57:01 np0005531888 nova_compute[186788]: 2025-11-22 08:57:01.675 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:57:01 np0005531888 podman[254520]: 2025-11-22 08:57:01.937480625 +0000 UTC m=+0.095727136 container create bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 03:57:01 np0005531888 podman[254520]: 2025-11-22 08:57:01.866331894 +0000 UTC m=+0.024578415 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:57:02 np0005531888 systemd[1]: Started libpod-conmon-bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe.scope.
Nov 22 03:57:02 np0005531888 systemd[1]: Started libcrun container.
Nov 22 03:57:02 np0005531888 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0fab27f2f0beee5629aa8d62405de905bb030ce8299b977641e7b3a7e87d68cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.070 186792 DEBUG nova.compute.manager [req-5b4daf82-e395-4745-a062-acc811e7531a req-156a5698-896a-4319-b122-edcc18e32f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.071 186792 DEBUG oslo_concurrency.lockutils [req-5b4daf82-e395-4745-a062-acc811e7531a req-156a5698-896a-4319-b122-edcc18e32f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.071 186792 DEBUG oslo_concurrency.lockutils [req-5b4daf82-e395-4745-a062-acc811e7531a req-156a5698-896a-4319-b122-edcc18e32f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.071 186792 DEBUG oslo_concurrency.lockutils [req-5b4daf82-e395-4745-a062-acc811e7531a req-156a5698-896a-4319-b122-edcc18e32f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.072 186792 DEBUG nova.compute.manager [req-5b4daf82-e395-4745-a062-acc811e7531a req-156a5698-896a-4319-b122-edcc18e32f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Processing event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:57:02 np0005531888 podman[254520]: 2025-11-22 08:57:02.072605899 +0000 UTC m=+0.230852420 container init bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.072 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.076 186792 DEBUG nova.virt.driver [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] Emitting event <LifecycleEvent: 1763801822.0759642, eef702de-73fd-4f21-bb6c-40b922ce92b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.076 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.078 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:57:02 np0005531888 podman[254520]: 2025-11-22 08:57:02.079045497 +0000 UTC m=+0.237292018 container start bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.081 186792 INFO nova.virt.libvirt.driver [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Instance spawned successfully.#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.082 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.098 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:57:02 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [NOTICE]   (254539) : New worker (254541) forked
Nov 22 03:57:02 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [NOTICE]   (254539) : Loading success.
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.104 186792 DEBUG nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.109 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.110 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.111 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.111 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.112 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.112 186792 DEBUG nova.virt.libvirt.driver [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.141 186792 INFO nova.compute.manager [None req-9068f025-28d3-4e43-ab0b-d3876189280c - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.202 186792 INFO nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Took 7.28 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.203 186792 DEBUG nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.298 186792 INFO nova.compute.manager [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Took 7.93 seconds to build instance.#033[00m
Nov 22 03:57:02 np0005531888 nova_compute[186788]: 2025-11-22 08:57:02.327 186792 DEBUG oslo_concurrency.lockutils [None req-5fbaf075-5f7d-4355-9f89-6d880cb183e0 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:03 np0005531888 nova_compute[186788]: 2025-11-22 08:57:03.855 186792 DEBUG nova.network.neutron [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Updated VIF entry in instance network info cache for port 71c0be23-6331-45a3-8a79-5b58adf7f6bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:57:03 np0005531888 nova_compute[186788]: 2025-11-22 08:57:03.857 186792 DEBUG nova.network.neutron [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Updating instance_info_cache with network_info: [{"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:57:03 np0005531888 nova_compute[186788]: 2025-11-22 08:57:03.873 186792 DEBUG oslo_concurrency.lockutils [req-8088317d-460a-44e1-96a2-e97a60e45c22 req-af841589-1ba0-4f4b-9b4c-40335784f580 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.198 186792 DEBUG nova.compute.manager [req-ad5256a3-9eec-4b26-bf4d-eeaea0b30f46 req-ca0b4900-5c5e-4811-9408-1aa3ce90db33 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.199 186792 DEBUG oslo_concurrency.lockutils [req-ad5256a3-9eec-4b26-bf4d-eeaea0b30f46 req-ca0b4900-5c5e-4811-9408-1aa3ce90db33 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.199 186792 DEBUG oslo_concurrency.lockutils [req-ad5256a3-9eec-4b26-bf4d-eeaea0b30f46 req-ca0b4900-5c5e-4811-9408-1aa3ce90db33 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.199 186792 DEBUG oslo_concurrency.lockutils [req-ad5256a3-9eec-4b26-bf4d-eeaea0b30f46 req-ca0b4900-5c5e-4811-9408-1aa3ce90db33 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.199 186792 DEBUG nova.compute.manager [req-ad5256a3-9eec-4b26-bf4d-eeaea0b30f46 req-ca0b4900-5c5e-4811-9408-1aa3ce90db33 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] No waiting events found dispatching network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.200 186792 WARNING nova.compute.manager [req-ad5256a3-9eec-4b26-bf4d-eeaea0b30f46 req-ca0b4900-5c5e-4811-9408-1aa3ce90db33 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received unexpected event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd for instance with vm_state active and task_state None.#033[00m
Nov 22 03:57:04 np0005531888 nova_compute[186788]: 2025-11-22 08:57:04.834 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:05 np0005531888 nova_compute[186788]: 2025-11-22 08:57:05.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:07 np0005531888 podman[254551]: 2025-11-22 08:57:07.680659875 +0000 UTC m=+0.052910603 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:57:07 np0005531888 podman[254550]: 2025-11-22 08:57:07.706540601 +0000 UTC m=+0.081049495 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:57:08 np0005531888 nova_compute[186788]: 2025-11-22 08:57:08.389 186792 DEBUG nova.compute.manager [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-changed-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:57:08 np0005531888 nova_compute[186788]: 2025-11-22 08:57:08.390 186792 DEBUG nova.compute.manager [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Refreshing instance network info cache due to event network-changed-71c0be23-6331-45a3-8a79-5b58adf7f6bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:57:08 np0005531888 nova_compute[186788]: 2025-11-22 08:57:08.390 186792 DEBUG oslo_concurrency.lockutils [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:57:08 np0005531888 nova_compute[186788]: 2025-11-22 08:57:08.390 186792 DEBUG oslo_concurrency.lockutils [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:57:08 np0005531888 nova_compute[186788]: 2025-11-22 08:57:08.390 186792 DEBUG nova.network.neutron [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Refreshing network info cache for port 71c0be23-6331-45a3-8a79-5b58adf7f6bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:57:09 np0005531888 nova_compute[186788]: 2025-11-22 08:57:09.836 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:10 np0005531888 nova_compute[186788]: 2025-11-22 08:57:10.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:13 np0005531888 nova_compute[186788]: 2025-11-22 08:57:13.977 186792 DEBUG nova.network.neutron [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Updated VIF entry in instance network info cache for port 71c0be23-6331-45a3-8a79-5b58adf7f6bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:57:13 np0005531888 nova_compute[186788]: 2025-11-22 08:57:13.977 186792 DEBUG nova.network.neutron [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Updating instance_info_cache with network_info: [{"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:57:14 np0005531888 nova_compute[186788]: 2025-11-22 08:57:14.082 186792 DEBUG oslo_concurrency.lockutils [req-04571290-57cc-4ae2-99a5-e2d91373c48e req-e71b60ec-d636-401e-b2be-60121c12e2f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eef702de-73fd-4f21-bb6c-40b922ce92b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:57:14 np0005531888 nova_compute[186788]: 2025-11-22 08:57:14.838 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:15 np0005531888 nova_compute[186788]: 2025-11-22 08:57:15.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:18 np0005531888 podman[254611]: 2025-11-22 08:57:18.705638156 +0000 UTC m=+0.069570432 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 03:57:18 np0005531888 podman[254612]: 2025-11-22 08:57:18.72080137 +0000 UTC m=+0.085111155 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:57:19 np0005531888 nova_compute[186788]: 2025-11-22 08:57:19.840 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:20 np0005531888 nova_compute[186788]: 2025-11-22 08:57:20.112 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:20Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:1e:c7 10.100.0.11
Nov 22 03:57:20 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:20Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:1e:c7 10.100.0.11
Nov 22 03:57:22 np0005531888 nova_compute[186788]: 2025-11-22 08:57:22.333 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:23 np0005531888 podman[254654]: 2025-11-22 08:57:23.688241558 +0000 UTC m=+0.057412474 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 22 03:57:23 np0005531888 podman[254655]: 2025-11-22 08:57:23.688614257 +0000 UTC m=+0.056581594 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:57:23 np0005531888 podman[254656]: 2025-11-22 08:57:23.737891039 +0000 UTC m=+0.096913585 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:57:24 np0005531888 nova_compute[186788]: 2025-11-22 08:57:24.843 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:25 np0005531888 nova_compute[186788]: 2025-11-22 08:57:25.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.437 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.437 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.438 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.438 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.438 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.446 186792 INFO nova.compute.manager [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Terminating instance#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.456 186792 DEBUG nova.compute.manager [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:57:27 np0005531888 kernel: tap71c0be23-63 (unregistering): left promiscuous mode
Nov 22 03:57:27 np0005531888 NetworkManager[55166]: <info>  [1763801847.4788] device (tap71c0be23-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.487 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:27Z|00785|binding|INFO|Releasing lport 71c0be23-6331-45a3-8a79-5b58adf7f6bd from this chassis (sb_readonly=0)
Nov 22 03:57:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:27Z|00786|binding|INFO|Setting lport 71c0be23-6331-45a3-8a79-5b58adf7f6bd down in Southbound
Nov 22 03:57:27 np0005531888 ovn_controller[95067]: 2025-11-22T08:57:27Z|00787|binding|INFO|Removing iface tap71c0be23-63 ovn-installed in OVS
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.491 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.496 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:1e:c7 10.100.0.11'], port_security=['fa:16:3e:b0:1e:c7 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'eef702de-73fd-4f21-bb6c-40b922ce92b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9f305522-88f5-4c20-9009-29570bb0d99a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71640610-572c-4a8b-b43c-9737c485843b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>], logical_port=71c0be23-6331-45a3-8a79-5b58adf7f6bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f03cf796940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.497 104023 INFO neutron.agent.ovn.metadata.agent [-] Port 71c0be23-6331-45a3-8a79-5b58adf7f6bd in datapath 13f06a8d-f6ae-46ea-b973-f89bfd41893c unbound from our chassis#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.499 104023 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13f06a8d-f6ae-46ea-b973-f89bfd41893c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.500 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a7b1ad-8a93-4801-aeee-6dedd73c0a08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.500 104023 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c namespace which is not needed anymore#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.507 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Nov 22 03:57:27 np0005531888 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000bb.scope: Consumed 15.310s CPU time.
Nov 22 03:57:27 np0005531888 systemd-machined[153106]: Machine qemu-89-instance-000000bb terminated.
Nov 22 03:57:27 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [NOTICE]   (254539) : haproxy version is 2.8.14-c23fe91
Nov 22 03:57:27 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [NOTICE]   (254539) : path to executable is /usr/sbin/haproxy
Nov 22 03:57:27 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [WARNING]  (254539) : Exiting Master process...
Nov 22 03:57:27 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [ALERT]    (254539) : Current worker (254541) exited with code 143 (Terminated)
Nov 22 03:57:27 np0005531888 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[254535]: [WARNING]  (254539) : All workers exited. Exiting... (0)
Nov 22 03:57:27 np0005531888 systemd[1]: libpod-bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe.scope: Deactivated successfully.
Nov 22 03:57:27 np0005531888 podman[254745]: 2025-11-22 08:57:27.639466564 +0000 UTC m=+0.050547015 container died bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:57:27 np0005531888 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe-userdata-shm.mount: Deactivated successfully.
Nov 22 03:57:27 np0005531888 systemd[1]: var-lib-containers-storage-overlay-0fab27f2f0beee5629aa8d62405de905bb030ce8299b977641e7b3a7e87d68cb-merged.mount: Deactivated successfully.
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.683 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.688 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 podman[254745]: 2025-11-22 08:57:27.704694888 +0000 UTC m=+0.115775339 container cleanup bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:57:27 np0005531888 systemd[1]: libpod-conmon-bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe.scope: Deactivated successfully.
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.729 186792 INFO nova.virt.libvirt.driver [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Instance destroyed successfully.#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.729 186792 DEBUG nova.objects.instance [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid eef702de-73fd-4f21-bb6c-40b922ce92b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.741 186792 DEBUG nova.virt.libvirt.vif [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:56:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-766773495',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=187,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPHag9aku7GBAkvmU1fN0BEWX+GKKnlz+wxkaBB81usyGYFIYXdms2nPbWtkH6Pt7jECf5QAIbXgbB8vKLjCswltA0JVkNEQJtQT24F3RbwiywHh6gfKMrOlaLUdm6xTrw==',key_name='tempest-TestSecurityGroupsBasicOps-1099778393',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:57:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-4h80siie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:57:02Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=eef702de-73fd-4f21-bb6c-40b922ce92b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.741 186792 DEBUG nova.network.os_vif_util [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "address": "fa:16:3e:b0:1e:c7", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71c0be23-63", "ovs_interfaceid": "71c0be23-6331-45a3-8a79-5b58adf7f6bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.742 186792 DEBUG nova.network.os_vif_util [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.743 186792 DEBUG os_vif [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.745 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.745 186792 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71c0be23-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.747 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.748 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.751 186792 INFO os_vif [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:1e:c7,bridge_name='br-int',has_traffic_filtering=True,id=71c0be23-6331-45a3-8a79-5b58adf7f6bd,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71c0be23-63')#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.752 186792 INFO nova.virt.libvirt.driver [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Deleting instance files /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7_del#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.752 186792 INFO nova.virt.libvirt.driver [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Deletion of /var/lib/nova/instances/eef702de-73fd-4f21-bb6c-40b922ce92b7_del complete#033[00m
Nov 22 03:57:27 np0005531888 podman[254791]: 2025-11-22 08:57:27.788250334 +0000 UTC m=+0.058270065 container remove bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.793 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a518cf-1aa0-4b68-b936-ddc6ec755802]: (4, ('Sat Nov 22 08:57:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c (bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe)\nbc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe\nSat Nov 22 08:57:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c (bc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe)\nbc9884670f759dfe0766e8417c7d905bbc5956b223522a19291db34b2cae48fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.795 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[4a891530-9979-47f8-a844-e96566af8375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.796 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f06a8d-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.799 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 kernel: tap13f06a8d-f0: left promiscuous mode
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.810 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.812 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.813 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[72638b54-b4b0-4d60-9a3f-af3392f9add9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.819 186792 INFO nova.compute.manager [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.820 186792 DEBUG oslo.service.loopingcall [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.821 186792 DEBUG nova.compute.manager [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:57:27 np0005531888 nova_compute[186788]: 2025-11-22 08:57:27.821 186792 DEBUG nova.network.neutron [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.830 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[1334a8de-cab8-49c6-baa1-d44fd4f8f6c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.832 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[e5820038-b802-43ae-bdfc-b97d15707f0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.847 213587 DEBUG oslo.privsep.daemon [-] privsep: reply[2001995c-11fb-4a91-b713-00f9c7d3ab74]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 856195, 'reachable_time': 18555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254807, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:27 np0005531888 systemd[1]: run-netns-ovnmeta\x2d13f06a8d\x2df6ae\x2d46ea\x2db973\x2df89bfd41893c.mount: Deactivated successfully.
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.850 104136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:57:27 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:27.850 104136 DEBUG oslo.privsep.daemon [-] privsep: reply[e019ab4c-ad2a-4a5f-b3ab-f9a9fe865877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.159 186792 DEBUG nova.compute.manager [req-62631e22-f4b1-4d8e-87cf-455f900aeeda req-18967f94-b9d4-43c8-a263-0014e0ce7b66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-vif-unplugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.159 186792 DEBUG oslo_concurrency.lockutils [req-62631e22-f4b1-4d8e-87cf-455f900aeeda req-18967f94-b9d4-43c8-a263-0014e0ce7b66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.159 186792 DEBUG oslo_concurrency.lockutils [req-62631e22-f4b1-4d8e-87cf-455f900aeeda req-18967f94-b9d4-43c8-a263-0014e0ce7b66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.159 186792 DEBUG oslo_concurrency.lockutils [req-62631e22-f4b1-4d8e-87cf-455f900aeeda req-18967f94-b9d4-43c8-a263-0014e0ce7b66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.159 186792 DEBUG nova.compute.manager [req-62631e22-f4b1-4d8e-87cf-455f900aeeda req-18967f94-b9d4-43c8-a263-0014e0ce7b66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] No waiting events found dispatching network-vif-unplugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.160 186792 DEBUG nova.compute.manager [req-62631e22-f4b1-4d8e-87cf-455f900aeeda req-18967f94-b9d4-43c8-a263-0014e0ce7b66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-vif-unplugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.889 186792 DEBUG nova.network.neutron [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.904 186792 INFO nova.compute.manager [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Took 1.08 seconds to deallocate network for instance.#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.994 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:28 np0005531888 nova_compute[186788]: 2025-11-22 08:57:28.995 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:29 np0005531888 nova_compute[186788]: 2025-11-22 08:57:29.064 186792 DEBUG nova.compute.provider_tree [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:57:29 np0005531888 nova_compute[186788]: 2025-11-22 08:57:29.077 186792 DEBUG nova.scheduler.client.report [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:57:29 np0005531888 nova_compute[186788]: 2025-11-22 08:57:29.098 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:29 np0005531888 nova_compute[186788]: 2025-11-22 08:57:29.126 186792 INFO nova.scheduler.client.report [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance eef702de-73fd-4f21-bb6c-40b922ce92b7#033[00m
Nov 22 03:57:29 np0005531888 nova_compute[186788]: 2025-11-22 08:57:29.190 186792 DEBUG oslo_concurrency.lockutils [None req-356f1760-dffe-46d3-8ff2-0e958346d3a6 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:29 np0005531888 nova_compute[186788]: 2025-11-22 08:57:29.844 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:30.217 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.218 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:30 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:30.219 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.333 186792 DEBUG nova.compute.manager [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.334 186792 DEBUG oslo_concurrency.lockutils [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.334 186792 DEBUG oslo_concurrency.lockutils [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.335 186792 DEBUG oslo_concurrency.lockutils [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eef702de-73fd-4f21-bb6c-40b922ce92b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.335 186792 DEBUG nova.compute.manager [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] No waiting events found dispatching network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.335 186792 WARNING nova.compute.manager [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received unexpected event network-vif-plugged-71c0be23-6331-45a3-8a79-5b58adf7f6bd for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:57:30 np0005531888 nova_compute[186788]: 2025-11-22 08:57:30.335 186792 DEBUG nova.compute.manager [req-5167ee92-cab1-4e86-be58-ffc78de44685 req-1b546f67-dd49-4b29-a648-204f4ca2654d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Received event network-vif-deleted-71c0be23-6331-45a3-8a79-5b58adf7f6bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:57:32 np0005531888 nova_compute[186788]: 2025-11-22 08:57:32.749 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:34 np0005531888 nova_compute[186788]: 2025-11-22 08:57:34.846 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:36.879 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:36.880 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:36.880 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:36 np0005531888 nova_compute[186788]: 2025-11-22 08:57:36.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.722 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.750 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.788 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:57:37 np0005531888 nova_compute[186788]: 2025-11-22 08:57:37.979 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:57:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:57:38.221 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:38 np0005531888 podman[254810]: 2025-11-22 08:57:38.680442108 +0000 UTC m=+0.051785755 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:57:38 np0005531888 podman[254811]: 2025-11-22 08:57:38.691245614 +0000 UTC m=+0.054250596 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:57:39 np0005531888 nova_compute[186788]: 2025-11-22 08:57:39.847 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:41 np0005531888 nova_compute[186788]: 2025-11-22 08:57:41.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:42 np0005531888 nova_compute[186788]: 2025-11-22 08:57:42.727 186792 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801847.7263017, eef702de-73fd-4f21-bb6c-40b922ce92b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:57:42 np0005531888 nova_compute[186788]: 2025-11-22 08:57:42.727 186792 INFO nova.compute.manager [-] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:57:42 np0005531888 nova_compute[186788]: 2025-11-22 08:57:42.754 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:42 np0005531888 nova_compute[186788]: 2025-11-22 08:57:42.756 186792 DEBUG nova.compute.manager [None req-646c34c2-d32f-4ae0-b0ae-841db825dbac - - - - - -] [instance: eef702de-73fd-4f21-bb6c-40b922ce92b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:57:44 np0005531888 nova_compute[186788]: 2025-11-22 08:57:44.848 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:47 np0005531888 nova_compute[186788]: 2025-11-22 08:57:47.757 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:48 np0005531888 nova_compute[186788]: 2025-11-22 08:57:48.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:49 np0005531888 podman[254849]: 2025-11-22 08:57:49.679379959 +0000 UTC m=+0.056588664 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:57:49 np0005531888 podman[254850]: 2025-11-22 08:57:49.697642878 +0000 UTC m=+0.073405337 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:57:49 np0005531888 nova_compute[186788]: 2025-11-22 08:57:49.851 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:51 np0005531888 nova_compute[186788]: 2025-11-22 08:57:51.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:51 np0005531888 nova_compute[186788]: 2025-11-22 08:57:51.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:52 np0005531888 nova_compute[186788]: 2025-11-22 08:57:52.759 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:53 np0005531888 nova_compute[186788]: 2025-11-22 08:57:53.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:53 np0005531888 nova_compute[186788]: 2025-11-22 08:57:53.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:57:54 np0005531888 podman[254895]: 2025-11-22 08:57:54.691219267 +0000 UTC m=+0.060126210 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:57:54 np0005531888 podman[254894]: 2025-11-22 08:57:54.698322222 +0000 UTC m=+0.066458315 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:57:54 np0005531888 podman[254896]: 2025-11-22 08:57:54.720150079 +0000 UTC m=+0.080291605 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:57:54 np0005531888 nova_compute[186788]: 2025-11-22 08:57:54.852 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:54 np0005531888 nova_compute[186788]: 2025-11-22 08:57:54.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:56 np0005531888 nova_compute[186788]: 2025-11-22 08:57:56.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:56 np0005531888 nova_compute[186788]: 2025-11-22 08:57:56.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:56 np0005531888 nova_compute[186788]: 2025-11-22 08:57:56.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:56 np0005531888 nova_compute[186788]: 2025-11-22 08:57:56.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:56 np0005531888 nova_compute[186788]: 2025-11-22 08:57:56.982 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.295 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.296 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5711MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.296 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.297 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.546 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.546 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.595 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.612 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.676 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.677 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:57 np0005531888 nova_compute[186788]: 2025-11-22 08:57:57.762 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:59 np0005531888 nova_compute[186788]: 2025-11-22 08:57:59.853 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:02 np0005531888 nova_compute[186788]: 2025-11-22 08:58:02.766 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:04 np0005531888 nova_compute[186788]: 2025-11-22 08:58:04.855 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:07 np0005531888 nova_compute[186788]: 2025-11-22 08:58:07.769 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:09 np0005531888 podman[254958]: 2025-11-22 08:58:09.695512258 +0000 UTC m=+0.053403686 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 03:58:09 np0005531888 podman[254957]: 2025-11-22 08:58:09.704461818 +0000 UTC m=+0.070697541 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:58:09 np0005531888 nova_compute[186788]: 2025-11-22 08:58:09.857 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:12 np0005531888 nova_compute[186788]: 2025-11-22 08:58:12.772 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:14 np0005531888 nova_compute[186788]: 2025-11-22 08:58:14.859 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:17 np0005531888 nova_compute[186788]: 2025-11-22 08:58:17.775 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:19 np0005531888 nova_compute[186788]: 2025-11-22 08:58:19.860 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:20 np0005531888 podman[255001]: 2025-11-22 08:58:20.685012373 +0000 UTC m=+0.055251600 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:58:20 np0005531888 podman[255000]: 2025-11-22 08:58:20.69218011 +0000 UTC m=+0.065429531 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Nov 22 03:58:22 np0005531888 nova_compute[186788]: 2025-11-22 08:58:22.779 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:23 np0005531888 ovn_controller[95067]: 2025-11-22T08:58:23Z|00788|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 03:58:24 np0005531888 nova_compute[186788]: 2025-11-22 08:58:24.862 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:25 np0005531888 podman[255044]: 2025-11-22 08:58:25.68905816 +0000 UTC m=+0.055551327 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 22 03:58:25 np0005531888 podman[255046]: 2025-11-22 08:58:25.728280185 +0000 UTC m=+0.082920281 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 22 03:58:25 np0005531888 podman[255045]: 2025-11-22 08:58:25.730455149 +0000 UTC m=+0.088825356 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:58:27 np0005531888 nova_compute[186788]: 2025-11-22 08:58:27.782 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:29 np0005531888 nova_compute[186788]: 2025-11-22 08:58:29.864 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:32 np0005531888 nova_compute[186788]: 2025-11-22 08:58:32.784 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:58:32.925 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:58:32 np0005531888 nova_compute[186788]: 2025-11-22 08:58:32.926 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:32 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:58:32.927 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:58:34 np0005531888 nova_compute[186788]: 2025-11-22 08:58:34.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 08:58:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:58:36.880 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:58:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:58:36.880 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:58:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:58:36.881 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:58:37 np0005531888 nova_compute[186788]: 2025-11-22 08:58:37.788 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:38 np0005531888 nova_compute[186788]: 2025-11-22 08:58:38.678 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:38 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:58:38.930 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:58:38 np0005531888 nova_compute[186788]: 2025-11-22 08:58:38.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:38 np0005531888 nova_compute[186788]: 2025-11-22 08:58:38.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:58:38 np0005531888 nova_compute[186788]: 2025-11-22 08:58:38.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:58:38 np0005531888 nova_compute[186788]: 2025-11-22 08:58:38.991 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:58:39 np0005531888 nova_compute[186788]: 2025-11-22 08:58:39.867 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:40 np0005531888 podman[255113]: 2025-11-22 08:58:40.680944533 +0000 UTC m=+0.056335787 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:58:40 np0005531888 podman[255114]: 2025-11-22 08:58:40.689650167 +0000 UTC m=+0.061632087 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:58:41 np0005531888 nova_compute[186788]: 2025-11-22 08:58:41.985 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:42 np0005531888 nova_compute[186788]: 2025-11-22 08:58:42.791 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:44 np0005531888 nova_compute[186788]: 2025-11-22 08:58:44.868 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:47 np0005531888 nova_compute[186788]: 2025-11-22 08:58:47.794 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:49 np0005531888 nova_compute[186788]: 2025-11-22 08:58:49.870 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:49 np0005531888 nova_compute[186788]: 2025-11-22 08:58:49.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:51 np0005531888 podman[255156]: 2025-11-22 08:58:51.704769887 +0000 UTC m=+0.064118699 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:58:51 np0005531888 podman[255155]: 2025-11-22 08:58:51.735141725 +0000 UTC m=+0.094714212 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:58:51 np0005531888 nova_compute[186788]: 2025-11-22 08:58:51.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:52 np0005531888 nova_compute[186788]: 2025-11-22 08:58:52.796 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:53 np0005531888 nova_compute[186788]: 2025-11-22 08:58:53.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:54 np0005531888 nova_compute[186788]: 2025-11-22 08:58:54.872 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:54 np0005531888 nova_compute[186788]: 2025-11-22 08:58:54.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:54 np0005531888 nova_compute[186788]: 2025-11-22 08:58:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:58:55 np0005531888 nova_compute[186788]: 2025-11-22 08:58:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:56 np0005531888 podman[255197]: 2025-11-22 08:58:56.695502998 +0000 UTC m=+0.065716558 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible)
Nov 22 03:58:56 np0005531888 podman[255198]: 2025-11-22 08:58:56.742554235 +0000 UTC m=+0.102576034 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:58:56 np0005531888 podman[255199]: 2025-11-22 08:58:56.744546844 +0000 UTC m=+0.107760892 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:58:57 np0005531888 nova_compute[186788]: 2025-11-22 08:58:57.798 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:57 np0005531888 nova_compute[186788]: 2025-11-22 08:58:57.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:57 np0005531888 nova_compute[186788]: 2025-11-22 08:58:57.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:58:57 np0005531888 nova_compute[186788]: 2025-11-22 08:58:57.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:58:57 np0005531888 nova_compute[186788]: 2025-11-22 08:58:57.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:58:57 np0005531888 nova_compute[186788]: 2025-11-22 08:58:57.989 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.163 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.164 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5714MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.164 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.164 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.532 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.533 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.565 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.584 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.587 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:58:58 np0005531888 nova_compute[186788]: 2025-11-22 08:58:58.587 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:58:59 np0005531888 nova_compute[186788]: 2025-11-22 08:58:59.874 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:02 np0005531888 nova_compute[186788]: 2025-11-22 08:59:02.801 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:04 np0005531888 nova_compute[186788]: 2025-11-22 08:59:04.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:07 np0005531888 nova_compute[186788]: 2025-11-22 08:59:07.804 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:09 np0005531888 nova_compute[186788]: 2025-11-22 08:59:09.877 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:11 np0005531888 podman[255262]: 2025-11-22 08:59:11.676805611 +0000 UTC m=+0.053520947 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:59:11 np0005531888 podman[255263]: 2025-11-22 08:59:11.676787261 +0000 UTC m=+0.048701269 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 22 03:59:12 np0005531888 nova_compute[186788]: 2025-11-22 08:59:12.807 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:14 np0005531888 nova_compute[186788]: 2025-11-22 08:59:14.879 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:17 np0005531888 nova_compute[186788]: 2025-11-22 08:59:17.810 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:19 np0005531888 nova_compute[186788]: 2025-11-22 08:59:19.880 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:22 np0005531888 podman[255307]: 2025-11-22 08:59:22.706305585 +0000 UTC m=+0.068979908 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 22 03:59:22 np0005531888 podman[255308]: 2025-11-22 08:59:22.719331315 +0000 UTC m=+0.085990236 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:59:22 np0005531888 nova_compute[186788]: 2025-11-22 08:59:22.813 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:24 np0005531888 nova_compute[186788]: 2025-11-22 08:59:24.582 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:24 np0005531888 nova_compute[186788]: 2025-11-22 08:59:24.882 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:27 np0005531888 podman[255348]: 2025-11-22 08:59:27.682226999 +0000 UTC m=+0.056685516 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:59:27 np0005531888 podman[255347]: 2025-11-22 08:59:27.685770396 +0000 UTC m=+0.060385876 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public)
Nov 22 03:59:27 np0005531888 podman[255349]: 2025-11-22 08:59:27.700857287 +0000 UTC m=+0.072844003 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:59:27 np0005531888 nova_compute[186788]: 2025-11-22 08:59:27.815 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:29 np0005531888 nova_compute[186788]: 2025-11-22 08:59:29.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:32 np0005531888 nova_compute[186788]: 2025-11-22 08:59:32.818 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:34 np0005531888 nova_compute[186788]: 2025-11-22 08:59:34.886 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:59:36.880 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:59:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:59:36.881 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:59:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 08:59:36.881 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:59:37 np0005531888 nova_compute[186788]: 2025-11-22 08:59:37.821 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:39 np0005531888 nova_compute[186788]: 2025-11-22 08:59:39.887 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:39 np0005531888 nova_compute[186788]: 2025-11-22 08:59:39.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:39 np0005531888 nova_compute[186788]: 2025-11-22 08:59:39.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:59:39 np0005531888 nova_compute[186788]: 2025-11-22 08:59:39.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:59:39 np0005531888 nova_compute[186788]: 2025-11-22 08:59:39.968 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:59:39 np0005531888 nova_compute[186788]: 2025-11-22 08:59:39.969 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:42 np0005531888 podman[255409]: 2025-11-22 08:59:42.676436469 +0000 UTC m=+0.050053462 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:59:42 np0005531888 podman[255410]: 2025-11-22 08:59:42.682988901 +0000 UTC m=+0.051186121 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:59:42 np0005531888 nova_compute[186788]: 2025-11-22 08:59:42.823 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:42 np0005531888 nova_compute[186788]: 2025-11-22 08:59:42.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:44 np0005531888 nova_compute[186788]: 2025-11-22 08:59:44.888 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:47 np0005531888 nova_compute[186788]: 2025-11-22 08:59:47.826 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:49 np0005531888 nova_compute[186788]: 2025-11-22 08:59:49.889 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:51 np0005531888 nova_compute[186788]: 2025-11-22 08:59:51.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:52 np0005531888 nova_compute[186788]: 2025-11-22 08:59:52.829 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:52 np0005531888 nova_compute[186788]: 2025-11-22 08:59:52.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:53 np0005531888 podman[255450]: 2025-11-22 08:59:53.880053466 +0000 UTC m=+0.058811638 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:59:53 np0005531888 podman[255449]: 2025-11-22 08:59:53.90462323 +0000 UTC m=+0.080192094 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 03:59:54 np0005531888 nova_compute[186788]: 2025-11-22 08:59:54.893 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:54 np0005531888 nova_compute[186788]: 2025-11-22 08:59:54.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:54 np0005531888 nova_compute[186788]: 2025-11-22 08:59:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:59:55 np0005531888 nova_compute[186788]: 2025-11-22 08:59:55.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:57 np0005531888 nova_compute[186788]: 2025-11-22 08:59:57.832 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:57 np0005531888 nova_compute[186788]: 2025-11-22 08:59:57.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:57 np0005531888 podman[255492]: 2025-11-22 08:59:57.955867568 +0000 UTC m=+0.074545476 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 03:59:57 np0005531888 podman[255491]: 2025-11-22 08:59:57.960974493 +0000 UTC m=+0.084494290 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 03:59:57 np0005531888 podman[255493]: 2025-11-22 08:59:57.990466798 +0000 UTC m=+0.104675245 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 22 03:59:59 np0005531888 nova_compute[186788]: 2025-11-22 08:59:59.893 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:59 np0005531888 nova_compute[186788]: 2025-11-22 08:59:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:59 np0005531888 nova_compute[186788]: 2025-11-22 08:59:59.991 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:59:59 np0005531888 nova_compute[186788]: 2025-11-22 08:59:59.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:59:59 np0005531888 nova_compute[186788]: 2025-11-22 08:59:59.992 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:59:59 np0005531888 nova_compute[186788]: 2025-11-22 08:59:59.992 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.130 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.131 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5723MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.131 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.131 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.902 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.903 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.952 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.976 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.978 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:00:00 np0005531888 nova_compute[186788]: 2025-11-22 09:00:00.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:00:02 np0005531888 nova_compute[186788]: 2025-11-22 09:00:02.836 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:04 np0005531888 nova_compute[186788]: 2025-11-22 09:00:04.895 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:07 np0005531888 nova_compute[186788]: 2025-11-22 09:00:07.838 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:09 np0005531888 nova_compute[186788]: 2025-11-22 09:00:09.911 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:12 np0005531888 nova_compute[186788]: 2025-11-22 09:00:12.842 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:13 np0005531888 podman[255556]: 2025-11-22 09:00:13.680265792 +0000 UTC m=+0.053073887 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:00:13 np0005531888 podman[255557]: 2025-11-22 09:00:13.709528282 +0000 UTC m=+0.076506134 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 04:00:14 np0005531888 nova_compute[186788]: 2025-11-22 09:00:14.912 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:14 np0005531888 nova_compute[186788]: 2025-11-22 09:00:14.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:14 np0005531888 nova_compute[186788]: 2025-11-22 09:00:14.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:00:14 np0005531888 nova_compute[186788]: 2025-11-22 09:00:14.984 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:00:14 np0005531888 nova_compute[186788]: 2025-11-22 09:00:14.985 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:14 np0005531888 nova_compute[186788]: 2025-11-22 09:00:14.985 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:00:17 np0005531888 nova_compute[186788]: 2025-11-22 09:00:17.844 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:19 np0005531888 nova_compute[186788]: 2025-11-22 09:00:19.914 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:19 np0005531888 nova_compute[186788]: 2025-11-22 09:00:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:22 np0005531888 nova_compute[186788]: 2025-11-22 09:00:22.847 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:24 np0005531888 systemd-logind[825]: New session 49 of user zuul.
Nov 22 04:00:24 np0005531888 systemd[1]: Started Session 49 of User zuul.
Nov 22 04:00:24 np0005531888 podman[255602]: 2025-11-22 09:00:24.60302019 +0000 UTC m=+0.078811650 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:00:24 np0005531888 podman[255600]: 2025-11-22 09:00:24.631773438 +0000 UTC m=+0.112462358 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 04:00:24 np0005531888 nova_compute[186788]: 2025-11-22 09:00:24.917 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:27 np0005531888 nova_compute[186788]: 2025-11-22 09:00:27.850 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:28 np0005531888 podman[255780]: 2025-11-22 09:00:28.491354538 +0000 UTC m=+0.061119763 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6)
Nov 22 04:00:28 np0005531888 podman[255784]: 2025-11-22 09:00:28.492091987 +0000 UTC m=+0.062566260 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:00:28 np0005531888 podman[255785]: 2025-11-22 09:00:28.528846751 +0000 UTC m=+0.095109961 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 04:00:29 np0005531888 nova_compute[186788]: 2025-11-22 09:00:29.918 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:31 np0005531888 ovs-vsctl[255874]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 04:00:32 np0005531888 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 255666 (sos)
Nov 22 04:00:32 np0005531888 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 22 04:00:32 np0005531888 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 22 04:00:32 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 04:00:32 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 04:00:32 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 04:00:32 np0005531888 nova_compute[186788]: 2025-11-22 09:00:32.853 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:34 np0005531888 nova_compute[186788]: 2025-11-22 09:00:34.919 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:36 np0005531888 systemd[1]: Starting Hostname Service...
Nov 22 04:00:36 np0005531888 systemd[1]: Started Hostname Service.
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:00:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:00:36.882 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:00:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:00:36.882 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:00:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:00:36.883 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:00:37 np0005531888 nova_compute[186788]: 2025-11-22 09:00:37.856 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:39 np0005531888 nova_compute[186788]: 2025-11-22 09:00:39.921 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:40 np0005531888 nova_compute[186788]: 2025-11-22 09:00:40.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:41 np0005531888 nova_compute[186788]: 2025-11-22 09:00:41.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:41 np0005531888 nova_compute[186788]: 2025-11-22 09:00:41.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:00:41 np0005531888 nova_compute[186788]: 2025-11-22 09:00:41.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:00:41 np0005531888 nova_compute[186788]: 2025-11-22 09:00:41.975 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:00:42 np0005531888 nova_compute[186788]: 2025-11-22 09:00:42.858 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:42 np0005531888 nova_compute[186788]: 2025-11-22 09:00:42.968 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:43 np0005531888 podman[257346]: 2025-11-22 09:00:43.821032123 +0000 UTC m=+0.073083198 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:00:43 np0005531888 podman[257351]: 2025-11-22 09:00:43.83712706 +0000 UTC m=+0.087876633 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 04:00:44 np0005531888 ovs-appctl[257686]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 04:00:44 np0005531888 ovs-appctl[257690]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 04:00:44 np0005531888 ovs-appctl[257694]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 04:00:44 np0005531888 nova_compute[186788]: 2025-11-22 09:00:44.923 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:45 np0005531888 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck1038762957-merged.mount: Deactivated successfully.
Nov 22 04:00:47 np0005531888 nova_compute[186788]: 2025-11-22 09:00:47.860 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:49 np0005531888 nova_compute[186788]: 2025-11-22 09:00:49.933 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:52 np0005531888 nova_compute[186788]: 2025-11-22 09:00:52.863 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:53 np0005531888 nova_compute[186788]: 2025-11-22 09:00:53.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:54 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 04:00:54 np0005531888 podman[259002]: 2025-11-22 09:00:54.713519066 +0000 UTC m=+0.068598649 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:00:54 np0005531888 podman[259005]: 2025-11-22 09:00:54.748417526 +0000 UTC m=+0.094656721 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:00:54 np0005531888 nova_compute[186788]: 2025-11-22 09:00:54.934 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:54 np0005531888 nova_compute[186788]: 2025-11-22 09:00:54.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:55 np0005531888 systemd[1]: Starting Time & Date Service...
Nov 22 04:00:55 np0005531888 nova_compute[186788]: 2025-11-22 09:00:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:55 np0005531888 nova_compute[186788]: 2025-11-22 09:00:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:55 np0005531888 nova_compute[186788]: 2025-11-22 09:00:55.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:00:56 np0005531888 systemd[1]: Started Time & Date Service.
Nov 22 04:00:57 np0005531888 nova_compute[186788]: 2025-11-22 09:00:57.866 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:57 np0005531888 nova_compute[186788]: 2025-11-22 09:00:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:58 np0005531888 podman[259193]: 2025-11-22 09:00:58.694100614 +0000 UTC m=+0.070642602 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Nov 22 04:00:58 np0005531888 podman[259194]: 2025-11-22 09:00:58.696839412 +0000 UTC m=+0.069482073 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:00:58 np0005531888 podman[259195]: 2025-11-22 09:00:58.732319475 +0000 UTC m=+0.103900840 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 04:00:59 np0005531888 nova_compute[186788]: 2025-11-22 09:00:59.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:59 np0005531888 nova_compute[186788]: 2025-11-22 09:00:59.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:59 np0005531888 nova_compute[186788]: 2025-11-22 09:00:59.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:00:59 np0005531888 nova_compute[186788]: 2025-11-22 09:00:59.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:00:59 np0005531888 nova_compute[186788]: 2025-11-22 09:00:59.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:00:59 np0005531888 nova_compute[186788]: 2025-11-22 09:00:59.981 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.174 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.176 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5453MB free_disk=72.73851776123047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.176 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.176 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.251 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.253 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.279 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.317 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.318 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.396 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.488 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.619 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.722 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.766 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:01:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:00.766 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:01:02 np0005531888 nova_compute[186788]: 2025-11-22 09:01:02.868 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:04 np0005531888 nova_compute[186788]: 2025-11-22 09:01:04.938 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:07 np0005531888 nova_compute[186788]: 2025-11-22 09:01:07.870 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:09 np0005531888 nova_compute[186788]: 2025-11-22 09:01:09.940 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:12 np0005531888 nova_compute[186788]: 2025-11-22 09:01:12.874 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:14 np0005531888 podman[259268]: 2025-11-22 09:01:14.722930669 +0000 UTC m=+0.068229582 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 22 04:01:14 np0005531888 podman[259267]: 2025-11-22 09:01:14.757554701 +0000 UTC m=+0.103333026 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:01:14 np0005531888 nova_compute[186788]: 2025-11-22 09:01:14.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:17 np0005531888 nova_compute[186788]: 2025-11-22 09:01:17.876 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:19 np0005531888 nova_compute[186788]: 2025-11-22 09:01:19.950 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:22 np0005531888 nova_compute[186788]: 2025-11-22 09:01:22.879 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:24 np0005531888 nova_compute[186788]: 2025-11-22 09:01:24.953 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:25 np0005531888 podman[259309]: 2025-11-22 09:01:25.687115079 +0000 UTC m=+0.060308217 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:01:25 np0005531888 podman[259308]: 2025-11-22 09:01:25.695368032 +0000 UTC m=+0.067999525 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 04:01:26 np0005531888 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 04:01:26 np0005531888 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 04:01:26 np0005531888 nova_compute[186788]: 2025-11-22 09:01:26.763 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:27 np0005531888 systemd[1]: session-49.scope: Deactivated successfully.
Nov 22 04:01:27 np0005531888 systemd[1]: session-49.scope: Consumed 1min 30.921s CPU time, 810.1M memory peak, read 297.4M from disk, written 60.7M to disk.
Nov 22 04:01:27 np0005531888 systemd-logind[825]: Session 49 logged out. Waiting for processes to exit.
Nov 22 04:01:27 np0005531888 systemd-logind[825]: Removed session 49.
Nov 22 04:01:27 np0005531888 systemd-logind[825]: New session 50 of user zuul.
Nov 22 04:01:27 np0005531888 systemd[1]: Started Session 50 of User zuul.
Nov 22 04:01:27 np0005531888 nova_compute[186788]: 2025-11-22 09:01:27.881 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:28 np0005531888 systemd[1]: session-50.scope: Deactivated successfully.
Nov 22 04:01:28 np0005531888 systemd-logind[825]: Session 50 logged out. Waiting for processes to exit.
Nov 22 04:01:28 np0005531888 systemd-logind[825]: Removed session 50.
Nov 22 04:01:28 np0005531888 systemd-logind[825]: New session 51 of user zuul.
Nov 22 04:01:28 np0005531888 systemd[1]: Started Session 51 of User zuul.
Nov 22 04:01:28 np0005531888 systemd[1]: session-51.scope: Deactivated successfully.
Nov 22 04:01:28 np0005531888 systemd-logind[825]: Session 51 logged out. Waiting for processes to exit.
Nov 22 04:01:28 np0005531888 systemd-logind[825]: Removed session 51.
Nov 22 04:01:29 np0005531888 podman[259414]: 2025-11-22 09:01:29.701660644 +0000 UTC m=+0.067382201 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:01:29 np0005531888 podman[259413]: 2025-11-22 09:01:29.712413728 +0000 UTC m=+0.074593568 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal)
Nov 22 04:01:29 np0005531888 podman[259415]: 2025-11-22 09:01:29.731708734 +0000 UTC m=+0.097511263 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 04:01:29 np0005531888 nova_compute[186788]: 2025-11-22 09:01:29.955 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:32 np0005531888 nova_compute[186788]: 2025-11-22 09:01:32.884 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:34 np0005531888 nova_compute[186788]: 2025-11-22 09:01:34.958 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:01:36.883 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:01:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:01:36.884 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:01:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:01:36.884 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:01:37 np0005531888 nova_compute[186788]: 2025-11-22 09:01:37.887 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:39 np0005531888 nova_compute[186788]: 2025-11-22 09:01:39.960 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:40 np0005531888 nova_compute[186788]: 2025-11-22 09:01:40.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:42 np0005531888 nova_compute[186788]: 2025-11-22 09:01:42.889 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:42 np0005531888 nova_compute[186788]: 2025-11-22 09:01:42.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:42 np0005531888 nova_compute[186788]: 2025-11-22 09:01:42.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:01:42 np0005531888 nova_compute[186788]: 2025-11-22 09:01:42.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:01:42 np0005531888 nova_compute[186788]: 2025-11-22 09:01:42.974 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:01:44 np0005531888 nova_compute[186788]: 2025-11-22 09:01:44.962 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:44 np0005531888 nova_compute[186788]: 2025-11-22 09:01:44.968 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:45 np0005531888 podman[259477]: 2025-11-22 09:01:45.693380004 +0000 UTC m=+0.058093951 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 04:01:45 np0005531888 podman[259476]: 2025-11-22 09:01:45.703155005 +0000 UTC m=+0.070580659 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:01:47 np0005531888 nova_compute[186788]: 2025-11-22 09:01:47.891 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:49 np0005531888 nova_compute[186788]: 2025-11-22 09:01:49.963 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:52 np0005531888 nova_compute[186788]: 2025-11-22 09:01:52.894 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:53 np0005531888 nova_compute[186788]: 2025-11-22 09:01:53.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:54 np0005531888 nova_compute[186788]: 2025-11-22 09:01:54.964 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:55 np0005531888 nova_compute[186788]: 2025-11-22 09:01:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:56 np0005531888 podman[259519]: 2025-11-22 09:01:56.683077146 +0000 UTC m=+0.053320254 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:01:56 np0005531888 podman[259518]: 2025-11-22 09:01:56.685065546 +0000 UTC m=+0.059795964 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 04:01:57 np0005531888 nova_compute[186788]: 2025-11-22 09:01:57.896 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:57 np0005531888 nova_compute[186788]: 2025-11-22 09:01:57.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:57 np0005531888 nova_compute[186788]: 2025-11-22 09:01:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:57 np0005531888 nova_compute[186788]: 2025-11-22 09:01:57.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:01:58 np0005531888 nova_compute[186788]: 2025-11-22 09:01:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:59 np0005531888 nova_compute[186788]: 2025-11-22 09:01:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:59 np0005531888 nova_compute[186788]: 2025-11-22 09:01:59.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:59 np0005531888 nova_compute[186788]: 2025-11-22 09:01:59.998 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:01:59 np0005531888 nova_compute[186788]: 2025-11-22 09:01:59.998 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:01:59 np0005531888 nova_compute[186788]: 2025-11-22 09:01:59.998 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:01:59.999 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.220 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.221 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.25876998901367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.221 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.221 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.498 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.499 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:02:00 np0005531888 podman[259557]: 2025-11-22 09:02:00.710642682 +0000 UTC m=+0.068273263 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal)
Nov 22 04:02:00 np0005531888 podman[259558]: 2025-11-22 09:02:00.742670461 +0000 UTC m=+0.084944164 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.764 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:02:00 np0005531888 podman[259564]: 2025-11-22 09:02:00.779648652 +0000 UTC m=+0.116627074 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.792 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.893 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:02:00 np0005531888 nova_compute[186788]: 2025-11-22 09:02:00.893 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:02:02 np0005531888 nova_compute[186788]: 2025-11-22 09:02:02.899 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:04 np0005531888 nova_compute[186788]: 2025-11-22 09:02:04.968 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:07 np0005531888 nova_compute[186788]: 2025-11-22 09:02:07.901 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:09 np0005531888 nova_compute[186788]: 2025-11-22 09:02:09.969 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:12 np0005531888 nova_compute[186788]: 2025-11-22 09:02:12.904 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:14 np0005531888 nova_compute[186788]: 2025-11-22 09:02:14.973 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:16 np0005531888 podman[259621]: 2025-11-22 09:02:16.683335845 +0000 UTC m=+0.055399167 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:02:16 np0005531888 podman[259622]: 2025-11-22 09:02:16.706507305 +0000 UTC m=+0.078687029 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:02:17 np0005531888 nova_compute[186788]: 2025-11-22 09:02:17.905 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:19 np0005531888 nova_compute[186788]: 2025-11-22 09:02:19.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:22 np0005531888 nova_compute[186788]: 2025-11-22 09:02:22.908 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:24 np0005531888 nova_compute[186788]: 2025-11-22 09:02:24.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:27 np0005531888 podman[259662]: 2025-11-22 09:02:27.681234489 +0000 UTC m=+0.049719977 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:02:27 np0005531888 podman[259661]: 2025-11-22 09:02:27.682020057 +0000 UTC m=+0.055069687 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:02:27 np0005531888 nova_compute[186788]: 2025-11-22 09:02:27.911 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:29 np0005531888 nova_compute[186788]: 2025-11-22 09:02:29.978 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:31 np0005531888 podman[259703]: 2025-11-22 09:02:31.681149062 +0000 UTC m=+0.055612880 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 04:02:31 np0005531888 podman[259704]: 2025-11-22 09:02:31.683211034 +0000 UTC m=+0.057849887 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Nov 22 04:02:31 np0005531888 podman[259705]: 2025-11-22 09:02:31.716696669 +0000 UTC m=+0.084537455 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 04:02:32 np0005531888 nova_compute[186788]: 2025-11-22 09:02:32.912 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:34 np0005531888 nova_compute[186788]: 2025-11-22 09:02:34.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:02:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:02:36.883 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:02:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:02:36.884 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:02:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:02:36.884 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:02:37 np0005531888 nova_compute[186788]: 2025-11-22 09:02:37.915 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:39 np0005531888 nova_compute[186788]: 2025-11-22 09:02:39.981 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:41 np0005531888 nova_compute[186788]: 2025-11-22 09:02:41.893 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:42 np0005531888 nova_compute[186788]: 2025-11-22 09:02:42.918 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:43 np0005531888 nova_compute[186788]: 2025-11-22 09:02:43.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:43 np0005531888 nova_compute[186788]: 2025-11-22 09:02:43.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:02:43 np0005531888 nova_compute[186788]: 2025-11-22 09:02:43.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:02:43 np0005531888 nova_compute[186788]: 2025-11-22 09:02:43.979 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:02:44 np0005531888 nova_compute[186788]: 2025-11-22 09:02:44.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:45 np0005531888 nova_compute[186788]: 2025-11-22 09:02:45.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:47 np0005531888 podman[259770]: 2025-11-22 09:02:47.687882191 +0000 UTC m=+0.051901161 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:02:47 np0005531888 podman[259771]: 2025-11-22 09:02:47.689431759 +0000 UTC m=+0.049548212 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:02:47 np0005531888 nova_compute[186788]: 2025-11-22 09:02:47.920 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:49 np0005531888 nova_compute[186788]: 2025-11-22 09:02:49.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:52 np0005531888 nova_compute[186788]: 2025-11-22 09:02:52.925 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:54 np0005531888 nova_compute[186788]: 2025-11-22 09:02:54.984 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:55 np0005531888 nova_compute[186788]: 2025-11-22 09:02:55.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:56 np0005531888 nova_compute[186788]: 2025-11-22 09:02:56.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:57 np0005531888 nova_compute[186788]: 2025-11-22 09:02:57.928 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:57 np0005531888 nova_compute[186788]: 2025-11-22 09:02:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:57 np0005531888 nova_compute[186788]: 2025-11-22 09:02:57.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:02:58 np0005531888 podman[259814]: 2025-11-22 09:02:58.678521075 +0000 UTC m=+0.050279180 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:02:58 np0005531888 podman[259813]: 2025-11-22 09:02:58.688614233 +0000 UTC m=+0.062424278 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 04:02:59 np0005531888 nova_compute[186788]: 2025-11-22 09:02:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:59 np0005531888 nova_compute[186788]: 2025-11-22 09:02:59.986 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:00 np0005531888 nova_compute[186788]: 2025-11-22 09:03:00.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:00 np0005531888 nova_compute[186788]: 2025-11-22 09:03:00.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.000 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.001 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.001 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.001 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.188 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.189 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5714MB free_disk=73.25920486450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.189 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.189 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.376 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.376 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.684 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.715 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.717 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:03:01 np0005531888 nova_compute[186788]: 2025-11-22 09:03:01.717 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:03:02 np0005531888 podman[259857]: 2025-11-22 09:03:02.705100355 +0000 UTC m=+0.072129947 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 04:03:02 np0005531888 podman[259856]: 2025-11-22 09:03:02.715945003 +0000 UTC m=+0.080737940 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6)
Nov 22 04:03:02 np0005531888 podman[259858]: 2025-11-22 09:03:02.724449652 +0000 UTC m=+0.092270014 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 04:03:02 np0005531888 nova_compute[186788]: 2025-11-22 09:03:02.929 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:04 np0005531888 nova_compute[186788]: 2025-11-22 09:03:04.988 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:07 np0005531888 nova_compute[186788]: 2025-11-22 09:03:07.933 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:09 np0005531888 nova_compute[186788]: 2025-11-22 09:03:09.990 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:12 np0005531888 nova_compute[186788]: 2025-11-22 09:03:12.936 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:14 np0005531888 nova_compute[186788]: 2025-11-22 09:03:14.992 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:17 np0005531888 nova_compute[186788]: 2025-11-22 09:03:17.939 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:18 np0005531888 podman[259924]: 2025-11-22 09:03:18.7103799 +0000 UTC m=+0.073305836 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 04:03:18 np0005531888 podman[259923]: 2025-11-22 09:03:18.727767189 +0000 UTC m=+0.088739977 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:03:19 np0005531888 nova_compute[186788]: 2025-11-22 09:03:19.993 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:22 np0005531888 nova_compute[186788]: 2025-11-22 09:03:22.942 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:24 np0005531888 nova_compute[186788]: 2025-11-22 09:03:24.996 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:26 np0005531888 nova_compute[186788]: 2025-11-22 09:03:26.712 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:27 np0005531888 nova_compute[186788]: 2025-11-22 09:03:27.945 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:29 np0005531888 podman[259966]: 2025-11-22 09:03:29.728357019 +0000 UTC m=+0.091229578 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:03:29 np0005531888 podman[259967]: 2025-11-22 09:03:29.745140562 +0000 UTC m=+0.096511498 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:03:30 np0005531888 nova_compute[186788]: 2025-11-22 09:03:29.999 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:32 np0005531888 nova_compute[186788]: 2025-11-22 09:03:32.947 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:33 np0005531888 podman[260008]: 2025-11-22 09:03:33.716464433 +0000 UTC m=+0.083838507 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 04:03:33 np0005531888 podman[260009]: 2025-11-22 09:03:33.728281293 +0000 UTC m=+0.096132608 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 04:03:33 np0005531888 podman[260010]: 2025-11-22 09:03:33.770983836 +0000 UTC m=+0.120733945 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 04:03:35 np0005531888 nova_compute[186788]: 2025-11-22 09:03:35.000 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:03:36.884 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:03:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:03:36.885 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:03:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:03:36.885 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:03:37 np0005531888 nova_compute[186788]: 2025-11-22 09:03:37.950 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:40 np0005531888 nova_compute[186788]: 2025-11-22 09:03:40.001 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:42 np0005531888 nova_compute[186788]: 2025-11-22 09:03:42.951 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:42 np0005531888 nova_compute[186788]: 2025-11-22 09:03:42.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:45 np0005531888 nova_compute[186788]: 2025-11-22 09:03:45.003 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:45 np0005531888 nova_compute[186788]: 2025-11-22 09:03:45.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:45 np0005531888 nova_compute[186788]: 2025-11-22 09:03:45.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:45 np0005531888 nova_compute[186788]: 2025-11-22 09:03:45.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:03:45 np0005531888 nova_compute[186788]: 2025-11-22 09:03:45.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:03:45 np0005531888 nova_compute[186788]: 2025-11-22 09:03:45.975 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:03:47 np0005531888 nova_compute[186788]: 2025-11-22 09:03:47.954 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:49 np0005531888 podman[260072]: 2025-11-22 09:03:49.685762878 +0000 UTC m=+0.054736039 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:03:49 np0005531888 podman[260073]: 2025-11-22 09:03:49.71951314 +0000 UTC m=+0.088689786 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Nov 22 04:03:50 np0005531888 nova_compute[186788]: 2025-11-22 09:03:50.006 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:52 np0005531888 nova_compute[186788]: 2025-11-22 09:03:52.959 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:55 np0005531888 nova_compute[186788]: 2025-11-22 09:03:55.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:55 np0005531888 nova_compute[186788]: 2025-11-22 09:03:55.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:57 np0005531888 nova_compute[186788]: 2025-11-22 09:03:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:57 np0005531888 nova_compute[186788]: 2025-11-22 09:03:57.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:57 np0005531888 nova_compute[186788]: 2025-11-22 09:03:57.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:03:57 np0005531888 nova_compute[186788]: 2025-11-22 09:03:57.961 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:00 np0005531888 nova_compute[186788]: 2025-11-22 09:04:00.010 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:00 np0005531888 podman[260115]: 2025-11-22 09:04:00.681121868 +0000 UTC m=+0.054478453 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 04:04:00 np0005531888 podman[260116]: 2025-11-22 09:04:00.685663401 +0000 UTC m=+0.056216107 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:04:00 np0005531888 nova_compute[186788]: 2025-11-22 09:04:00.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:01 np0005531888 nova_compute[186788]: 2025-11-22 09:04:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:02 np0005531888 nova_compute[186788]: 2025-11-22 09:04:02.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:02 np0005531888 nova_compute[186788]: 2025-11-22 09:04:02.964 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:02 np0005531888 nova_compute[186788]: 2025-11-22 09:04:02.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:04:02 np0005531888 nova_compute[186788]: 2025-11-22 09:04:02.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:04:02 np0005531888 nova_compute[186788]: 2025-11-22 09:04:02.989 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:04:02 np0005531888 nova_compute[186788]: 2025-11-22 09:04:02.989 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.161 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.162 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.25920486450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.162 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.163 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.313 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.314 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.383 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.404 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.406 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:04:03 np0005531888 nova_compute[186788]: 2025-11-22 09:04:03.406 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:04:04 np0005531888 podman[260158]: 2025-11-22 09:04:04.694371851 +0000 UTC m=+0.065973286 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:04:04 np0005531888 podman[260159]: 2025-11-22 09:04:04.69515573 +0000 UTC m=+0.060467150 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, 
io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm)
Nov 22 04:04:04 np0005531888 podman[260160]: 2025-11-22 09:04:04.754655426 +0000 UTC m=+0.115375074 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:04:05 np0005531888 nova_compute[186788]: 2025-11-22 09:04:05.012 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:07 np0005531888 nova_compute[186788]: 2025-11-22 09:04:07.967 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:10 np0005531888 nova_compute[186788]: 2025-11-22 09:04:10.013 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:12 np0005531888 nova_compute[186788]: 2025-11-22 09:04:12.970 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:15 np0005531888 nova_compute[186788]: 2025-11-22 09:04:15.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:17 np0005531888 nova_compute[186788]: 2025-11-22 09:04:17.974 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:20 np0005531888 nova_compute[186788]: 2025-11-22 09:04:20.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:20 np0005531888 podman[260225]: 2025-11-22 09:04:20.687875095 +0000 UTC m=+0.060243515 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:04:20 np0005531888 podman[260226]: 2025-11-22 09:04:20.69336932 +0000 UTC m=+0.056508422 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 04:04:22 np0005531888 nova_compute[186788]: 2025-11-22 09:04:22.976 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:25 np0005531888 nova_compute[186788]: 2025-11-22 09:04:25.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:27 np0005531888 nova_compute[186788]: 2025-11-22 09:04:27.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:30 np0005531888 nova_compute[186788]: 2025-11-22 09:04:30.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:31 np0005531888 podman[260269]: 2025-11-22 09:04:31.686095695 +0000 UTC m=+0.062980772 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 04:04:31 np0005531888 podman[260270]: 2025-11-22 09:04:31.689491569 +0000 UTC m=+0.060433910 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:04:32 np0005531888 nova_compute[186788]: 2025-11-22 09:04:32.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:35 np0005531888 nova_compute[186788]: 2025-11-22 09:04:35.020 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:35 np0005531888 podman[260312]: 2025-11-22 09:04:35.69823247 +0000 UTC m=+0.064414008 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 04:04:35 np0005531888 podman[260313]: 2025-11-22 09:04:35.711008264 +0000 UTC m=+0.073649254 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 04:04:35 np0005531888 podman[260314]: 2025-11-22 09:04:35.740721036 +0000 UTC m=+0.099401319 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:04:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:04:36.886 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:04:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:04:36.886 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:04:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:04:36.887 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:04:37 np0005531888 nova_compute[186788]: 2025-11-22 09:04:37.984 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:40 np0005531888 nova_compute[186788]: 2025-11-22 09:04:40.023 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:42 np0005531888 nova_compute[186788]: 2025-11-22 09:04:42.987 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:43 np0005531888 nova_compute[186788]: 2025-11-22 09:04:43.407 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:45 np0005531888 nova_compute[186788]: 2025-11-22 09:04:45.024 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:46 np0005531888 nova_compute[186788]: 2025-11-22 09:04:46.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:47 np0005531888 nova_compute[186788]: 2025-11-22 09:04:47.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:47 np0005531888 nova_compute[186788]: 2025-11-22 09:04:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:04:47 np0005531888 nova_compute[186788]: 2025-11-22 09:04:47.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:04:47 np0005531888 nova_compute[186788]: 2025-11-22 09:04:47.972 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:04:47 np0005531888 nova_compute[186788]: 2025-11-22 09:04:47.993 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:50 np0005531888 nova_compute[186788]: 2025-11-22 09:04:50.024 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:51 np0005531888 podman[260376]: 2025-11-22 09:04:51.7365099 +0000 UTC m=+0.084888542 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:04:51 np0005531888 podman[260377]: 2025-11-22 09:04:51.753771755 +0000 UTC m=+0.099859210 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:04:52 np0005531888 nova_compute[186788]: 2025-11-22 09:04:52.994 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:55 np0005531888 nova_compute[186788]: 2025-11-22 09:04:55.026 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:55 np0005531888 nova_compute[186788]: 2025-11-22 09:04:55.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:57 np0005531888 nova_compute[186788]: 2025-11-22 09:04:57.995 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:58 np0005531888 nova_compute[186788]: 2025-11-22 09:04:58.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:58 np0005531888 nova_compute[186788]: 2025-11-22 09:04:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:58 np0005531888 nova_compute[186788]: 2025-11-22 09:04:58.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:05:00 np0005531888 nova_compute[186788]: 2025-11-22 09:05:00.029 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:01 np0005531888 nova_compute[186788]: 2025-11-22 09:05:01.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:02 np0005531888 podman[260419]: 2025-11-22 09:05:02.713803655 +0000 UTC m=+0.080188106 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 04:05:02 np0005531888 podman[260420]: 2025-11-22 09:05:02.718443879 +0000 UTC m=+0.076402713 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:05:03 np0005531888 nova_compute[186788]: 2025-11-22 09:05:02.999 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:03 np0005531888 nova_compute[186788]: 2025-11-22 09:05:03.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:03 np0005531888 nova_compute[186788]: 2025-11-22 09:05:03.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.237 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.238 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.238 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.238 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.421 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.422 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5709MB free_disk=73.25920486450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.422 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.422 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.744 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.744 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.772 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.789 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.790 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 04:05:04 np0005531888 nova_compute[186788]: 2025-11-22 09:05:04.790 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 04:05:05 np0005531888 nova_compute[186788]: 2025-11-22 09:05:05.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:06 np0005531888 podman[260464]: 2025-11-22 09:05:06.69628017 +0000 UTC m=+0.071934463 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible)
Nov 22 04:05:06 np0005531888 podman[260465]: 2025-11-22 09:05:06.720432924 +0000 UTC m=+0.092320474 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 04:05:06 np0005531888 podman[260466]: 2025-11-22 09:05:06.737203147 +0000 UTC m=+0.108465142 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:05:08 np0005531888 nova_compute[186788]: 2025-11-22 09:05:08.001 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:10 np0005531888 nova_compute[186788]: 2025-11-22 09:05:10.033 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:13 np0005531888 nova_compute[186788]: 2025-11-22 09:05:13.004 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:14 np0005531888 nova_compute[186788]: 2025-11-22 09:05:14.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:14 np0005531888 nova_compute[186788]: 2025-11-22 09:05:14.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 04:05:14 np0005531888 nova_compute[186788]: 2025-11-22 09:05:14.971 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 04:05:15 np0005531888 nova_compute[186788]: 2025-11-22 09:05:15.036 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:18 np0005531888 nova_compute[186788]: 2025-11-22 09:05:18.008 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:19 np0005531888 nova_compute[186788]: 2025-11-22 09:05:19.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:20 np0005531888 nova_compute[186788]: 2025-11-22 09:05:20.038 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:22 np0005531888 podman[260530]: 2025-11-22 09:05:22.706484126 +0000 UTC m=+0.067721089 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 04:05:22 np0005531888 podman[260529]: 2025-11-22 09:05:22.710659719 +0000 UTC m=+0.071194865 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:05:23 np0005531888 nova_compute[186788]: 2025-11-22 09:05:23.010 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:23 np0005531888 nova_compute[186788]: 2025-11-22 09:05:23.975 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:23 np0005531888 nova_compute[186788]: 2025-11-22 09:05:23.976 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 04:05:25 np0005531888 nova_compute[186788]: 2025-11-22 09:05:25.039 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:28 np0005531888 nova_compute[186788]: 2025-11-22 09:05:28.011 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:29 np0005531888 nova_compute[186788]: 2025-11-22 09:05:29.962 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:30 np0005531888 nova_compute[186788]: 2025-11-22 09:05:30.042 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:33 np0005531888 nova_compute[186788]: 2025-11-22 09:05:33.015 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:33 np0005531888 podman[260575]: 2025-11-22 09:05:33.692333952 +0000 UTC m=+0.055664592 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:05:33 np0005531888 podman[260574]: 2025-11-22 09:05:33.722989377 +0000 UTC m=+0.089116046 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:05:35 np0005531888 nova_compute[186788]: 2025-11-22 09:05:35.047 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:05:36.887 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 04:05:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:05:36.887 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 04:05:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:05:36.887 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 04:05:37 np0005531888 podman[260616]: 2025-11-22 09:05:37.710426833 +0000 UTC m=+0.069574145 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 22 04:05:37 np0005531888 podman[260617]: 2025-11-22 09:05:37.727269788 +0000 UTC m=+0.086742848 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 04:05:37 np0005531888 podman[260615]: 2025-11-22 09:05:37.730137499 +0000 UTC m=+0.086465621 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Nov 22 04:05:38 np0005531888 nova_compute[186788]: 2025-11-22 09:05:38.017 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:40 np0005531888 nova_compute[186788]: 2025-11-22 09:05:40.050 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:42 np0005531888 nova_compute[186788]: 2025-11-22 09:05:42.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:43 np0005531888 nova_compute[186788]: 2025-11-22 09:05:43.019 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:45 np0005531888 nova_compute[186788]: 2025-11-22 09:05:45.051 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:47 np0005531888 nova_compute[186788]: 2025-11-22 09:05:47.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:48 np0005531888 nova_compute[186788]: 2025-11-22 09:05:48.021 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:48 np0005531888 nova_compute[186788]: 2025-11-22 09:05:48.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 04:05:48 np0005531888 nova_compute[186788]: 2025-11-22 09:05:48.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 04:05:48 np0005531888 nova_compute[186788]: 2025-11-22 09:05:48.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 04:05:48 np0005531888 nova_compute[186788]: 2025-11-22 09:05:48.975 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 04:05:50 np0005531888 nova_compute[186788]: 2025-11-22 09:05:50.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:53 np0005531888 nova_compute[186788]: 2025-11-22 09:05:53.027 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 04:05:53 np0005531888 podman[260681]: 2025-11-22 09:05:53.690694031 +0000 UTC m=+0.059060386 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:05:53 np0005531888 podman[260682]: 2025-11-22 09:05:53.717622505 +0000 UTC m=+0.085829536 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 04:05:55 np0005531888 nova_compute[186788]: 2025-11-22 09:05:55.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:56 np0005531888 nova_compute[186788]: 2025-11-22 09:05:56.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:58 np0005531888 nova_compute[186788]: 2025-11-22 09:05:58.031 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:58 np0005531888 nova_compute[186788]: 2025-11-22 09:05:58.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:00 np0005531888 nova_compute[186788]: 2025-11-22 09:06:00.058 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:00 np0005531888 nova_compute[186788]: 2025-11-22 09:06:00.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:00 np0005531888 nova_compute[186788]: 2025-11-22 09:06:00.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.034 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.994 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.996 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.996 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:06:03 np0005531888 nova_compute[186788]: 2025-11-22 09:06:03.996 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.194 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.195 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.25920486450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.196 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.196 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.274 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.275 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.300 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.381 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.382 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.407 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.475 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.508 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.557 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.558 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:06:04 np0005531888 nova_compute[186788]: 2025-11-22 09:06:04.558 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:06:04 np0005531888 podman[260726]: 2025-11-22 09:06:04.689818253 +0000 UTC m=+0.054888193 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:06:04 np0005531888 podman[260725]: 2025-11-22 09:06:04.721609927 +0000 UTC m=+0.079140301 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 22 04:06:05 np0005531888 nova_compute[186788]: 2025-11-22 09:06:05.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:06 np0005531888 nova_compute[186788]: 2025-11-22 09:06:06.558 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:08 np0005531888 nova_compute[186788]: 2025-11-22 09:06:08.037 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:08 np0005531888 podman[260772]: 2025-11-22 09:06:08.697347904 +0000 UTC m=+0.061916565 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 04:06:08 np0005531888 podman[260771]: 2025-11-22 09:06:08.711141624 +0000 UTC m=+0.080472262 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:06:08 np0005531888 podman[260773]: 2025-11-22 09:06:08.730478141 +0000 UTC m=+0.092735905 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 04:06:10 np0005531888 nova_compute[186788]: 2025-11-22 09:06:10.061 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:13 np0005531888 nova_compute[186788]: 2025-11-22 09:06:13.039 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:15 np0005531888 nova_compute[186788]: 2025-11-22 09:06:15.063 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:18 np0005531888 nova_compute[186788]: 2025-11-22 09:06:18.042 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:20 np0005531888 nova_compute[186788]: 2025-11-22 09:06:20.065 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:23 np0005531888 nova_compute[186788]: 2025-11-22 09:06:23.045 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:24 np0005531888 podman[260840]: 2025-11-22 09:06:24.681774506 +0000 UTC m=+0.053686725 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:06:24 np0005531888 podman[260841]: 2025-11-22 09:06:24.691253638 +0000 UTC m=+0.058454300 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 04:06:25 np0005531888 nova_compute[186788]: 2025-11-22 09:06:25.067 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:28 np0005531888 nova_compute[186788]: 2025-11-22 09:06:28.049 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:30 np0005531888 nova_compute[186788]: 2025-11-22 09:06:30.067 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:33 np0005531888 nova_compute[186788]: 2025-11-22 09:06:33.050 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:35 np0005531888 nova_compute[186788]: 2025-11-22 09:06:35.071 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:35 np0005531888 podman[260882]: 2025-11-22 09:06:35.691453218 +0000 UTC m=+0.054617816 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 04:06:35 np0005531888 podman[260883]: 2025-11-22 09:06:35.702758647 +0000 UTC m=+0.056834042 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:06:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:06:36.888 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:06:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:06:36.889 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:06:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:06:36.889 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:06:38 np0005531888 nova_compute[186788]: 2025-11-22 09:06:38.054 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:39 np0005531888 podman[260926]: 2025-11-22 09:06:39.682614946 +0000 UTC m=+0.054826731 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, release=1755695350, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 04:06:39 np0005531888 podman[260927]: 2025-11-22 09:06:39.704932996 +0000 UTC m=+0.069782310 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 04:06:39 np0005531888 podman[260928]: 2025-11-22 09:06:39.722076369 +0000 UTC m=+0.083804446 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:06:40 np0005531888 nova_compute[186788]: 2025-11-22 09:06:40.071 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:42 np0005531888 nova_compute[186788]: 2025-11-22 09:06:42.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:43 np0005531888 nova_compute[186788]: 2025-11-22 09:06:43.057 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:44 np0005531888 nova_compute[186788]: 2025-11-22 09:06:44.879 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:45 np0005531888 nova_compute[186788]: 2025-11-22 09:06:45.072 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:48 np0005531888 nova_compute[186788]: 2025-11-22 09:06:48.059 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:48 np0005531888 nova_compute[186788]: 2025-11-22 09:06:48.891 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:48 np0005531888 nova_compute[186788]: 2025-11-22 09:06:48.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:48 np0005531888 nova_compute[186788]: 2025-11-22 09:06:48.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:06:48 np0005531888 nova_compute[186788]: 2025-11-22 09:06:48.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:06:48 np0005531888 nova_compute[186788]: 2025-11-22 09:06:48.982 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:06:49 np0005531888 nova_compute[186788]: 2025-11-22 09:06:49.975 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:50 np0005531888 nova_compute[186788]: 2025-11-22 09:06:50.074 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:50 np0005531888 nova_compute[186788]: 2025-11-22 09:06:50.190 186792 DEBUG oslo_concurrency.processutils [None req-7f8d9728-2d8b-420d-a211-d58aa5b5dfd5 74ad5d4ed255439cafdb153ee87124a2 cb198b45e9034b108a19399d19c6cf14 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 04:06:50 np0005531888 nova_compute[186788]: 2025-11-22 09:06:50.216 186792 DEBUG oslo_concurrency.processutils [None req-7f8d9728-2d8b-420d-a211-d58aa5b5dfd5 74ad5d4ed255439cafdb153ee87124a2 cb198b45e9034b108a19399d19c6cf14 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 04:06:53 np0005531888 nova_compute[186788]: 2025-11-22 09:06:53.060 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:55 np0005531888 nova_compute[186788]: 2025-11-22 09:06:55.077 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:55 np0005531888 podman[260994]: 2025-11-22 09:06:55.677594417 +0000 UTC m=+0.046702311 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:06:55 np0005531888 podman[260995]: 2025-11-22 09:06:55.694283448 +0000 UTC m=+0.060291506 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 22 04:06:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:06:55.943 104023 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 04:06:55 np0005531888 nova_compute[186788]: 2025-11-22 09:06:55.944 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:55 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:06:55.945 104023 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 04:06:57 np0005531888 nova_compute[186788]: 2025-11-22 09:06:57.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:58 np0005531888 nova_compute[186788]: 2025-11-22 09:06:58.063 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:59 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:06:59.947 104023 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4984e16e-8f1c-4426-bfc6-5927f375ce79, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 04:06:59 np0005531888 nova_compute[186788]: 2025-11-22 09:06:59.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:00 np0005531888 nova_compute[186788]: 2025-11-22 09:07:00.078 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:01 np0005531888 nova_compute[186788]: 2025-11-22 09:07:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:01 np0005531888 nova_compute[186788]: 2025-11-22 09:07:01.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:07:03 np0005531888 nova_compute[186788]: 2025-11-22 09:07:03.066 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:03 np0005531888 nova_compute[186788]: 2025-11-22 09:07:03.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:03 np0005531888 nova_compute[186788]: 2025-11-22 09:07:03.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:07:03 np0005531888 nova_compute[186788]: 2025-11-22 09:07:03.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:07:03 np0005531888 nova_compute[186788]: 2025-11-22 09:07:03.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:07:03 np0005531888 nova_compute[186788]: 2025-11-22 09:07:03.984 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.148 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.150 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5692MB free_disk=73.25924682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.150 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.151 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.348 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.348 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.420 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.436 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.438 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:07:04 np0005531888 nova_compute[186788]: 2025-11-22 09:07:04.438 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:07:05 np0005531888 nova_compute[186788]: 2025-11-22 09:07:05.081 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:06 np0005531888 nova_compute[186788]: 2025-11-22 09:07:06.438 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:06 np0005531888 podman[261038]: 2025-11-22 09:07:06.67959641 +0000 UTC m=+0.057401645 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 04:07:06 np0005531888 podman[261039]: 2025-11-22 09:07:06.67959648 +0000 UTC m=+0.052920865 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:07:06 np0005531888 nova_compute[186788]: 2025-11-22 09:07:06.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:08 np0005531888 nova_compute[186788]: 2025-11-22 09:07:08.069 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:10 np0005531888 nova_compute[186788]: 2025-11-22 09:07:10.083 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:10 np0005531888 podman[261082]: 2025-11-22 09:07:10.688129996 +0000 UTC m=+0.060394019 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:07:10 np0005531888 podman[261083]: 2025-11-22 09:07:10.688790152 +0000 UTC m=+0.057542629 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:07:10 np0005531888 podman[261084]: 2025-11-22 09:07:10.713719806 +0000 UTC m=+0.082475163 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:07:13 np0005531888 nova_compute[186788]: 2025-11-22 09:07:13.071 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:15 np0005531888 nova_compute[186788]: 2025-11-22 09:07:15.086 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:18 np0005531888 nova_compute[186788]: 2025-11-22 09:07:18.074 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:20 np0005531888 nova_compute[186788]: 2025-11-22 09:07:20.089 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:23 np0005531888 nova_compute[186788]: 2025-11-22 09:07:23.077 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:25 np0005531888 nova_compute[186788]: 2025-11-22 09:07:25.090 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:26 np0005531888 podman[261144]: 2025-11-22 09:07:26.680325168 +0000 UTC m=+0.051563492 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:07:26 np0005531888 podman[261143]: 2025-11-22 09:07:26.700415232 +0000 UTC m=+0.074089226 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:07:28 np0005531888 nova_compute[186788]: 2025-11-22 09:07:28.080 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:30 np0005531888 nova_compute[186788]: 2025-11-22 09:07:30.092 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:33 np0005531888 nova_compute[186788]: 2025-11-22 09:07:33.083 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:33 np0005531888 nova_compute[186788]: 2025-11-22 09:07:33.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:35 np0005531888 nova_compute[186788]: 2025-11-22 09:07:35.095 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:07:36.891 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:07:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:07:36.891 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:07:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:07:36.891 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:07:37 np0005531888 podman[261189]: 2025-11-22 09:07:37.686425911 +0000 UTC m=+0.057787304 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:07:37 np0005531888 podman[261188]: 2025-11-22 09:07:37.704293031 +0000 UTC m=+0.080549724 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 04:07:38 np0005531888 nova_compute[186788]: 2025-11-22 09:07:38.086 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:40 np0005531888 nova_compute[186788]: 2025-11-22 09:07:40.098 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:41 np0005531888 podman[261235]: 2025-11-22 09:07:41.687468813 +0000 UTC m=+0.062672315 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:07:41 np0005531888 podman[261236]: 2025-11-22 09:07:41.694992758 +0000 UTC m=+0.063639699 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:07:41 np0005531888 podman[261237]: 2025-11-22 09:07:41.72388602 +0000 UTC m=+0.089340522 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 04:07:43 np0005531888 nova_compute[186788]: 2025-11-22 09:07:43.088 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:43 np0005531888 nova_compute[186788]: 2025-11-22 09:07:43.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:45 np0005531888 nova_compute[186788]: 2025-11-22 09:07:45.100 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:48 np0005531888 nova_compute[186788]: 2025-11-22 09:07:48.089 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:49 np0005531888 nova_compute[186788]: 2025-11-22 09:07:49.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:49 np0005531888 nova_compute[186788]: 2025-11-22 09:07:49.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:07:49 np0005531888 nova_compute[186788]: 2025-11-22 09:07:49.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:07:49 np0005531888 nova_compute[186788]: 2025-11-22 09:07:49.977 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:07:50 np0005531888 nova_compute[186788]: 2025-11-22 09:07:50.100 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:50 np0005531888 nova_compute[186788]: 2025-11-22 09:07:50.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:53 np0005531888 nova_compute[186788]: 2025-11-22 09:07:53.091 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:55 np0005531888 nova_compute[186788]: 2025-11-22 09:07:55.105 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:57 np0005531888 podman[261302]: 2025-11-22 09:07:57.671846461 +0000 UTC m=+0.047495131 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:07:57 np0005531888 podman[261303]: 2025-11-22 09:07:57.675694206 +0000 UTC m=+0.047917051 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 04:07:58 np0005531888 nova_compute[186788]: 2025-11-22 09:07:58.094 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:59 np0005531888 nova_compute[186788]: 2025-11-22 09:07:59.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:00 np0005531888 nova_compute[186788]: 2025-11-22 09:08:00.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:00 np0005531888 nova_compute[186788]: 2025-11-22 09:08:00.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:02 np0005531888 nova_compute[186788]: 2025-11-22 09:08:02.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:02 np0005531888 nova_compute[186788]: 2025-11-22 09:08:02.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:08:03 np0005531888 nova_compute[186788]: 2025-11-22 09:08:03.096 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:05 np0005531888 nova_compute[186788]: 2025-11-22 09:08:05.109 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:05 np0005531888 nova_compute[186788]: 2025-11-22 09:08:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:05 np0005531888 nova_compute[186788]: 2025-11-22 09:08:05.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:08:05 np0005531888 nova_compute[186788]: 2025-11-22 09:08:05.987 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:08:05 np0005531888 nova_compute[186788]: 2025-11-22 09:08:05.988 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:08:05 np0005531888 nova_compute[186788]: 2025-11-22 09:08:05.988 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.138 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.139 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.25924682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.139 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.139 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.292 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.292 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.339 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.354 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.356 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:08:06 np0005531888 nova_compute[186788]: 2025-11-22 09:08:06.356 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:08:07 np0005531888 nova_compute[186788]: 2025-11-22 09:08:07.357 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:07 np0005531888 nova_compute[186788]: 2025-11-22 09:08:07.357 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:08 np0005531888 nova_compute[186788]: 2025-11-22 09:08:08.099 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:08 np0005531888 podman[261346]: 2025-11-22 09:08:08.71830251 +0000 UTC m=+0.072140048 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 04:08:08 np0005531888 podman[261347]: 2025-11-22 09:08:08.732577512 +0000 UTC m=+0.089284221 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:08:10 np0005531888 nova_compute[186788]: 2025-11-22 09:08:10.111 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:12 np0005531888 podman[261391]: 2025-11-22 09:08:12.686294477 +0000 UTC m=+0.058325228 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 22 04:08:12 np0005531888 podman[261392]: 2025-11-22 09:08:12.690944931 +0000 UTC m=+0.057857486 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Nov 22 04:08:12 np0005531888 podman[261393]: 2025-11-22 09:08:12.721430822 +0000 UTC m=+0.087687681 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:08:13 np0005531888 nova_compute[186788]: 2025-11-22 09:08:13.101 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:15 np0005531888 nova_compute[186788]: 2025-11-22 09:08:15.113 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:18 np0005531888 nova_compute[186788]: 2025-11-22 09:08:18.104 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:20 np0005531888 nova_compute[186788]: 2025-11-22 09:08:20.114 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:23 np0005531888 nova_compute[186788]: 2025-11-22 09:08:23.107 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:25 np0005531888 nova_compute[186788]: 2025-11-22 09:08:25.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:28 np0005531888 nova_compute[186788]: 2025-11-22 09:08:28.110 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:28 np0005531888 podman[261452]: 2025-11-22 09:08:28.665044357 +0000 UTC m=+0.042627110 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:08:28 np0005531888 podman[261453]: 2025-11-22 09:08:28.67449527 +0000 UTC m=+0.049740166 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 04:08:30 np0005531888 nova_compute[186788]: 2025-11-22 09:08:30.118 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:33 np0005531888 nova_compute[186788]: 2025-11-22 09:08:33.112 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:35 np0005531888 nova_compute[186788]: 2025-11-22 09:08:35.121 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:08:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:08:36.892 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:08:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:08:36.893 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:08:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:08:36.893 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:08:38 np0005531888 nova_compute[186788]: 2025-11-22 09:08:38.115 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:39 np0005531888 podman[261497]: 2025-11-22 09:08:39.671629183 +0000 UTC m=+0.045441590 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:08:39 np0005531888 podman[261496]: 2025-11-22 09:08:39.704408861 +0000 UTC m=+0.081246532 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 22 04:08:40 np0005531888 nova_compute[186788]: 2025-11-22 09:08:40.123 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:43 np0005531888 nova_compute[186788]: 2025-11-22 09:08:43.117 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:43 np0005531888 podman[261538]: 2025-11-22 09:08:43.677649587 +0000 UTC m=+0.052424872 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Nov 22 04:08:43 np0005531888 podman[261539]: 2025-11-22 09:08:43.687585702 +0000 UTC m=+0.054437492 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 22 04:08:43 np0005531888 podman[261540]: 2025-11-22 09:08:43.715485179 +0000 UTC m=+0.082758690 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 04:08:43 np0005531888 nova_compute[186788]: 2025-11-22 09:08:43.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:45 np0005531888 nova_compute[186788]: 2025-11-22 09:08:45.125 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:48 np0005531888 nova_compute[186788]: 2025-11-22 09:08:48.133 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:50 np0005531888 nova_compute[186788]: 2025-11-22 09:08:50.127 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:50 np0005531888 nova_compute[186788]: 2025-11-22 09:08:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:50 np0005531888 nova_compute[186788]: 2025-11-22 09:08:50.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:08:50 np0005531888 nova_compute[186788]: 2025-11-22 09:08:50.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:08:50 np0005531888 nova_compute[186788]: 2025-11-22 09:08:50.977 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:08:52 np0005531888 nova_compute[186788]: 2025-11-22 09:08:52.972 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:53 np0005531888 nova_compute[186788]: 2025-11-22 09:08:53.146 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:55 np0005531888 nova_compute[186788]: 2025-11-22 09:08:55.129 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:58 np0005531888 nova_compute[186788]: 2025-11-22 09:08:58.148 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:59 np0005531888 podman[261604]: 2025-11-22 09:08:59.670146048 +0000 UTC m=+0.044727633 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 04:08:59 np0005531888 podman[261603]: 2025-11-22 09:08:59.678833012 +0000 UTC m=+0.053205231 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:08:59 np0005531888 nova_compute[186788]: 2025-11-22 09:08:59.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:00 np0005531888 nova_compute[186788]: 2025-11-22 09:09:00.132 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:02 np0005531888 nova_compute[186788]: 2025-11-22 09:09:02.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:02 np0005531888 nova_compute[186788]: 2025-11-22 09:09:02.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:02 np0005531888 nova_compute[186788]: 2025-11-22 09:09:02.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:09:03 np0005531888 nova_compute[186788]: 2025-11-22 09:09:03.149 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:05 np0005531888 nova_compute[186788]: 2025-11-22 09:09:05.133 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:06 np0005531888 nova_compute[186788]: 2025-11-22 09:09:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:06 np0005531888 nova_compute[186788]: 2025-11-22 09:09:06.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:09:06 np0005531888 nova_compute[186788]: 2025-11-22 09:09:06.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:09:06 np0005531888 nova_compute[186788]: 2025-11-22 09:09:06.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:09:06 np0005531888 nova_compute[186788]: 2025-11-22 09:09:06.980 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.125 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.126 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5698MB free_disk=73.25924682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.126 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.126 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.225 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.225 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.254 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.277 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.278 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:09:07 np0005531888 nova_compute[186788]: 2025-11-22 09:09:07.279 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:09:08 np0005531888 nova_compute[186788]: 2025-11-22 09:09:08.152 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:08 np0005531888 nova_compute[186788]: 2025-11-22 09:09:08.279 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:08 np0005531888 nova_compute[186788]: 2025-11-22 09:09:08.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:10 np0005531888 nova_compute[186788]: 2025-11-22 09:09:10.135 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:10 np0005531888 podman[261646]: 2025-11-22 09:09:10.703429031 +0000 UTC m=+0.079624461 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:09:10 np0005531888 podman[261647]: 2025-11-22 09:09:10.703429321 +0000 UTC m=+0.073978452 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:09:13 np0005531888 nova_compute[186788]: 2025-11-22 09:09:13.153 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:14 np0005531888 podman[261691]: 2025-11-22 09:09:14.685332431 +0000 UTC m=+0.059436744 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Nov 22 04:09:14 np0005531888 podman[261692]: 2025-11-22 09:09:14.68728157 +0000 UTC m=+0.058154674 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:09:14 np0005531888 podman[261693]: 2025-11-22 09:09:14.718230622 +0000 UTC m=+0.083338454 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 04:09:15 np0005531888 nova_compute[186788]: 2025-11-22 09:09:15.151 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:18 np0005531888 nova_compute[186788]: 2025-11-22 09:09:18.154 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:20 np0005531888 nova_compute[186788]: 2025-11-22 09:09:20.155 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:23 np0005531888 nova_compute[186788]: 2025-11-22 09:09:23.155 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:25 np0005531888 nova_compute[186788]: 2025-11-22 09:09:25.230 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:28 np0005531888 nova_compute[186788]: 2025-11-22 09:09:28.157 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:30 np0005531888 nova_compute[186788]: 2025-11-22 09:09:30.232 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:30 np0005531888 podman[261756]: 2025-11-22 09:09:30.667328563 +0000 UTC m=+0.046186089 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:09:30 np0005531888 podman[261757]: 2025-11-22 09:09:30.673452133 +0000 UTC m=+0.048524405 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 04:09:33 np0005531888 nova_compute[186788]: 2025-11-22 09:09:33.160 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:33 np0005531888 nova_compute[186788]: 2025-11-22 09:09:33.948 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:35 np0005531888 nova_compute[186788]: 2025-11-22 09:09:35.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:09:36.893 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:09:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:09:36.893 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:09:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:09:36.893 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:09:38 np0005531888 nova_compute[186788]: 2025-11-22 09:09:38.163 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:40 np0005531888 nova_compute[186788]: 2025-11-22 09:09:40.277 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:41 np0005531888 podman[261800]: 2025-11-22 09:09:41.687387642 +0000 UTC m=+0.064984692 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true)
Nov 22 04:09:41 np0005531888 podman[261801]: 2025-11-22 09:09:41.699331686 +0000 UTC m=+0.062896890 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:09:43 np0005531888 nova_compute[186788]: 2025-11-22 09:09:43.166 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:43 np0005531888 nova_compute[186788]: 2025-11-22 09:09:43.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:45 np0005531888 nova_compute[186788]: 2025-11-22 09:09:45.280 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:45 np0005531888 podman[261847]: 2025-11-22 09:09:45.679498054 +0000 UTC m=+0.050612508 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 22 04:09:45 np0005531888 podman[261846]: 2025-11-22 09:09:45.703688589 +0000 UTC m=+0.078639348 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 22 04:09:45 np0005531888 podman[261848]: 2025-11-22 09:09:45.708696393 +0000 UTC m=+0.076886455 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:09:48 np0005531888 nova_compute[186788]: 2025-11-22 09:09:48.207 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:50 np0005531888 nova_compute[186788]: 2025-11-22 09:09:50.282 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:50 np0005531888 nova_compute[186788]: 2025-11-22 09:09:50.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:50 np0005531888 nova_compute[186788]: 2025-11-22 09:09:50.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:09:50 np0005531888 nova_compute[186788]: 2025-11-22 09:09:50.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:09:51 np0005531888 nova_compute[186788]: 2025-11-22 09:09:51.006 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:09:53 np0005531888 nova_compute[186788]: 2025-11-22 09:09:53.002 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:53 np0005531888 nova_compute[186788]: 2025-11-22 09:09:53.209 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:55 np0005531888 nova_compute[186788]: 2025-11-22 09:09:55.805 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:58 np0005531888 nova_compute[186788]: 2025-11-22 09:09:58.212 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:59 np0005531888 nova_compute[186788]: 2025-11-22 09:09:59.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:00 np0005531888 nova_compute[186788]: 2025-11-22 09:10:00.285 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:01 np0005531888 podman[261910]: 2025-11-22 09:10:01.689434772 +0000 UTC m=+0.051129870 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 04:10:01 np0005531888 podman[261909]: 2025-11-22 09:10:01.710710197 +0000 UTC m=+0.079166102 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:10:02 np0005531888 nova_compute[186788]: 2025-11-22 09:10:02.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:02 np0005531888 nova_compute[186788]: 2025-11-22 09:10:02.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:10:03 np0005531888 nova_compute[186788]: 2025-11-22 09:10:03.214 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:04 np0005531888 nova_compute[186788]: 2025-11-22 09:10:04.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:05 np0005531888 nova_compute[186788]: 2025-11-22 09:10:05.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:06 np0005531888 nova_compute[186788]: 2025-11-22 09:10:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:06 np0005531888 nova_compute[186788]: 2025-11-22 09:10:06.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:10:06 np0005531888 nova_compute[186788]: 2025-11-22 09:10:06.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:10:06 np0005531888 nova_compute[186788]: 2025-11-22 09:10:06.979 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:10:06 np0005531888 nova_compute[186788]: 2025-11-22 09:10:06.979 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.120 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.121 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.25924682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.121 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.122 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.201 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.201 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.446 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.459 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.461 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:10:07 np0005531888 nova_compute[186788]: 2025-11-22 09:10:07.461 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:10:08 np0005531888 nova_compute[186788]: 2025-11-22 09:10:08.217 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:09 np0005531888 nova_compute[186788]: 2025-11-22 09:10:09.462 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:09 np0005531888 nova_compute[186788]: 2025-11-22 09:10:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:10 np0005531888 nova_compute[186788]: 2025-11-22 09:10:10.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:12 np0005531888 podman[261953]: 2025-11-22 09:10:12.686419352 +0000 UTC m=+0.058128443 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:10:12 np0005531888 podman[261952]: 2025-11-22 09:10:12.686834832 +0000 UTC m=+0.061725851 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:10:13 np0005531888 nova_compute[186788]: 2025-11-22 09:10:13.219 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:15 np0005531888 nova_compute[186788]: 2025-11-22 09:10:15.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:16 np0005531888 podman[261997]: 2025-11-22 09:10:16.67755168 +0000 UTC m=+0.055030417 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 22 04:10:16 np0005531888 podman[261998]: 2025-11-22 09:10:16.678027841 +0000 UTC m=+0.051363406 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 04:10:16 np0005531888 podman[261999]: 2025-11-22 09:10:16.716705014 +0000 UTC m=+0.081554710 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 04:10:18 np0005531888 nova_compute[186788]: 2025-11-22 09:10:18.222 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:20 np0005531888 nova_compute[186788]: 2025-11-22 09:10:20.293 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:22 np0005531888 nova_compute[186788]: 2025-11-22 09:10:22.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:23 np0005531888 nova_compute[186788]: 2025-11-22 09:10:23.225 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:25 np0005531888 nova_compute[186788]: 2025-11-22 09:10:25.336 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:28 np0005531888 nova_compute[186788]: 2025-11-22 09:10:28.228 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:28 np0005531888 nova_compute[186788]: 2025-11-22 09:10:28.968 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:28 np0005531888 nova_compute[186788]: 2025-11-22 09:10:28.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:10:28 np0005531888 nova_compute[186788]: 2025-11-22 09:10:28.991 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:10:30 np0005531888 nova_compute[186788]: 2025-11-22 09:10:30.337 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:30 np0005531888 nova_compute[186788]: 2025-11-22 09:10:30.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:30 np0005531888 nova_compute[186788]: 2025-11-22 09:10:30.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:10:32 np0005531888 podman[262060]: 2025-11-22 09:10:32.669433315 +0000 UTC m=+0.044478127 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:10:32 np0005531888 podman[262061]: 2025-11-22 09:10:32.68548139 +0000 UTC m=+0.056013211 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 04:10:33 np0005531888 nova_compute[186788]: 2025-11-22 09:10:33.231 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:35 np0005531888 nova_compute[186788]: 2025-11-22 09:10:35.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:10:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:10:36.894 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:10:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:10:36.894 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:10:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:10:36.894 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:10:38 np0005531888 nova_compute[186788]: 2025-11-22 09:10:38.234 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:40 np0005531888 nova_compute[186788]: 2025-11-22 09:10:40.340 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:43 np0005531888 nova_compute[186788]: 2025-11-22 09:10:43.236 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:43 np0005531888 podman[262101]: 2025-11-22 09:10:43.678214395 +0000 UTC m=+0.051597701 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 04:10:43 np0005531888 podman[262102]: 2025-11-22 09:10:43.714011657 +0000 UTC m=+0.082819581 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:10:45 np0005531888 nova_compute[186788]: 2025-11-22 09:10:45.342 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:45 np0005531888 nova_compute[186788]: 2025-11-22 09:10:45.971 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:47 np0005531888 podman[262142]: 2025-11-22 09:10:47.691805466 +0000 UTC m=+0.061607888 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 04:10:47 np0005531888 podman[262143]: 2025-11-22 09:10:47.708959488 +0000 UTC m=+0.075607543 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:10:47 np0005531888 podman[262144]: 2025-11-22 09:10:47.73053423 +0000 UTC m=+0.093172986 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:10:48 np0005531888 nova_compute[186788]: 2025-11-22 09:10:48.238 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:50 np0005531888 nova_compute[186788]: 2025-11-22 09:10:50.344 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:51 np0005531888 nova_compute[186788]: 2025-11-22 09:10:51.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:51 np0005531888 nova_compute[186788]: 2025-11-22 09:10:51.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:10:51 np0005531888 nova_compute[186788]: 2025-11-22 09:10:51.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:10:51 np0005531888 nova_compute[186788]: 2025-11-22 09:10:51.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:10:52 np0005531888 nova_compute[186788]: 2025-11-22 09:10:52.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:53 np0005531888 nova_compute[186788]: 2025-11-22 09:10:53.240 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:55 np0005531888 nova_compute[186788]: 2025-11-22 09:10:55.345 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:58 np0005531888 nova_compute[186788]: 2025-11-22 09:10:58.245 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:00 np0005531888 nova_compute[186788]: 2025-11-22 09:11:00.389 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:01 np0005531888 nova_compute[186788]: 2025-11-22 09:11:01.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:03 np0005531888 nova_compute[186788]: 2025-11-22 09:11:03.247 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:03 np0005531888 podman[262205]: 2025-11-22 09:11:03.684411158 +0000 UTC m=+0.052737501 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:11:03 np0005531888 podman[262206]: 2025-11-22 09:11:03.690944879 +0000 UTC m=+0.054699538 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 04:11:04 np0005531888 nova_compute[186788]: 2025-11-22 09:11:04.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:04 np0005531888 nova_compute[186788]: 2025-11-22 09:11:04.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:11:05 np0005531888 nova_compute[186788]: 2025-11-22 09:11:05.392 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:05 np0005531888 nova_compute[186788]: 2025-11-22 09:11:05.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:06 np0005531888 nova_compute[186788]: 2025-11-22 09:11:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:06 np0005531888 nova_compute[186788]: 2025-11-22 09:11:06.980 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:11:06 np0005531888 nova_compute[186788]: 2025-11-22 09:11:06.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:11:06 np0005531888 nova_compute[186788]: 2025-11-22 09:11:06.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:11:06 np0005531888 nova_compute[186788]: 2025-11-22 09:11:06.981 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.121 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.121 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5701MB free_disk=73.25924682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.122 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.122 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.201 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.201 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.221 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.248 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.249 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.276 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.324 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.359 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.380 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.382 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:11:07 np0005531888 nova_compute[186788]: 2025-11-22 09:11:07.382 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:11:08 np0005531888 nova_compute[186788]: 2025-11-22 09:11:08.251 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:09 np0005531888 nova_compute[186788]: 2025-11-22 09:11:09.383 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:10 np0005531888 nova_compute[186788]: 2025-11-22 09:11:10.394 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:11 np0005531888 nova_compute[186788]: 2025-11-22 09:11:11.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:13 np0005531888 nova_compute[186788]: 2025-11-22 09:11:13.254 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:14 np0005531888 podman[262251]: 2025-11-22 09:11:14.687195421 +0000 UTC m=+0.063877314 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 04:11:14 np0005531888 podman[262252]: 2025-11-22 09:11:14.700846528 +0000 UTC m=+0.071684077 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:11:15 np0005531888 nova_compute[186788]: 2025-11-22 09:11:15.395 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:18 np0005531888 nova_compute[186788]: 2025-11-22 09:11:18.257 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:18 np0005531888 podman[262296]: 2025-11-22 09:11:18.684272424 +0000 UTC m=+0.056120213 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 22 04:11:18 np0005531888 podman[262298]: 2025-11-22 09:11:18.713287439 +0000 UTC m=+0.082265357 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 04:11:18 np0005531888 podman[262297]: 2025-11-22 09:11:18.7133274 +0000 UTC m=+0.085357353 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 04:11:20 np0005531888 nova_compute[186788]: 2025-11-22 09:11:20.399 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:23 np0005531888 nova_compute[186788]: 2025-11-22 09:11:23.258 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:25 np0005531888 nova_compute[186788]: 2025-11-22 09:11:25.402 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:28 np0005531888 nova_compute[186788]: 2025-11-22 09:11:28.263 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:30 np0005531888 nova_compute[186788]: 2025-11-22 09:11:30.437 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:33 np0005531888 nova_compute[186788]: 2025-11-22 09:11:33.265 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:34 np0005531888 podman[262363]: 2025-11-22 09:11:34.674968811 +0000 UTC m=+0.047760558 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:11:34 np0005531888 podman[262364]: 2025-11-22 09:11:34.686143086 +0000 UTC m=+0.056301138 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 04:11:35 np0005531888 nova_compute[186788]: 2025-11-22 09:11:35.440 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:35 np0005531888 nova_compute[186788]: 2025-11-22 09:11:35.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:11:36.895 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:11:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:11:36.896 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:11:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:11:36.896 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:11:38 np0005531888 nova_compute[186788]: 2025-11-22 09:11:38.268 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:40 np0005531888 nova_compute[186788]: 2025-11-22 09:11:40.442 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:43 np0005531888 nova_compute[186788]: 2025-11-22 09:11:43.271 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:45 np0005531888 nova_compute[186788]: 2025-11-22 09:11:45.443 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:45 np0005531888 podman[262408]: 2025-11-22 09:11:45.671049329 +0000 UTC m=+0.044237811 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:11:45 np0005531888 podman[262407]: 2025-11-22 09:11:45.677299793 +0000 UTC m=+0.053627772 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 04:11:47 np0005531888 nova_compute[186788]: 2025-11-22 09:11:47.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:48 np0005531888 nova_compute[186788]: 2025-11-22 09:11:48.274 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:49 np0005531888 podman[262452]: 2025-11-22 09:11:49.689757046 +0000 UTC m=+0.061878665 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute)
Nov 22 04:11:49 np0005531888 podman[262451]: 2025-11-22 09:11:49.710114157 +0000 UTC m=+0.086498532 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 04:11:49 np0005531888 podman[262453]: 2025-11-22 09:11:49.711009839 +0000 UTC m=+0.080163665 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:11:50 np0005531888 nova_compute[186788]: 2025-11-22 09:11:50.445 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:51 np0005531888 nova_compute[186788]: 2025-11-22 09:11:51.956 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:51 np0005531888 nova_compute[186788]: 2025-11-22 09:11:51.957 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:11:51 np0005531888 nova_compute[186788]: 2025-11-22 09:11:51.957 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:11:51 np0005531888 nova_compute[186788]: 2025-11-22 09:11:51.983 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:11:53 np0005531888 nova_compute[186788]: 2025-11-22 09:11:53.276 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:53 np0005531888 nova_compute[186788]: 2025-11-22 09:11:53.976 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:55 np0005531888 nova_compute[186788]: 2025-11-22 09:11:55.447 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:58 np0005531888 nova_compute[186788]: 2025-11-22 09:11:58.278 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:00 np0005531888 nova_compute[186788]: 2025-11-22 09:12:00.450 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:02 np0005531888 nova_compute[186788]: 2025-11-22 09:12:02.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:03 np0005531888 nova_compute[186788]: 2025-11-22 09:12:03.286 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:05 np0005531888 nova_compute[186788]: 2025-11-22 09:12:05.492 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:05 np0005531888 podman[262515]: 2025-11-22 09:12:05.671307244 +0000 UTC m=+0.043513353 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:12:05 np0005531888 podman[262514]: 2025-11-22 09:12:05.671575151 +0000 UTC m=+0.045129743 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:12:05 np0005531888 nova_compute[186788]: 2025-11-22 09:12:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:05 np0005531888 nova_compute[186788]: 2025-11-22 09:12:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:05 np0005531888 nova_compute[186788]: 2025-11-22 09:12:05.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:12:07 np0005531888 nova_compute[186788]: 2025-11-22 09:12:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:07.999 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:07.999 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.000 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.000 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.156 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.157 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5692MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.157 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.157 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.242 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.242 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.273 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.287 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.289 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.289 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:12:08 np0005531888 nova_compute[186788]: 2025-11-22 09:12:08.289 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:09 np0005531888 nova_compute[186788]: 2025-11-22 09:12:09.290 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:10 np0005531888 nova_compute[186788]: 2025-11-22 09:12:10.494 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:13 np0005531888 nova_compute[186788]: 2025-11-22 09:12:13.290 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:13 np0005531888 nova_compute[186788]: 2025-11-22 09:12:13.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:15 np0005531888 nova_compute[186788]: 2025-11-22 09:12:15.496 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:16 np0005531888 podman[262556]: 2025-11-22 09:12:16.678444525 +0000 UTC m=+0.049220083 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:12:16 np0005531888 podman[262555]: 2025-11-22 09:12:16.678453516 +0000 UTC m=+0.053998251 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:12:18 np0005531888 nova_compute[186788]: 2025-11-22 09:12:18.294 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:20 np0005531888 nova_compute[186788]: 2025-11-22 09:12:20.498 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:20 np0005531888 podman[262601]: 2025-11-22 09:12:20.684505142 +0000 UTC m=+0.054640047 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 04:12:20 np0005531888 podman[262600]: 2025-11-22 09:12:20.68687344 +0000 UTC m=+0.059368453 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:12:20 np0005531888 podman[262602]: 2025-11-22 09:12:20.719227217 +0000 UTC m=+0.085181609 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:12:23 np0005531888 nova_compute[186788]: 2025-11-22 09:12:23.296 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:25 np0005531888 nova_compute[186788]: 2025-11-22 09:12:25.499 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:28 np0005531888 nova_compute[186788]: 2025-11-22 09:12:28.299 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:30 np0005531888 nova_compute[186788]: 2025-11-22 09:12:30.502 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:33 np0005531888 nova_compute[186788]: 2025-11-22 09:12:33.300 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:35 np0005531888 nova_compute[186788]: 2025-11-22 09:12:35.504 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:36 np0005531888 podman[262667]: 2025-11-22 09:12:36.684730388 +0000 UTC m=+0.053798246 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:12:36 np0005531888 podman[262668]: 2025-11-22 09:12:36.687105637 +0000 UTC m=+0.053721365 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:12:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:12:36.896 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:12:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:12:36.897 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:12:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:12:36.897 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:12:38 np0005531888 nova_compute[186788]: 2025-11-22 09:12:38.304 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:40 np0005531888 nova_compute[186788]: 2025-11-22 09:12:40.505 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:43 np0005531888 nova_compute[186788]: 2025-11-22 09:12:43.306 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:45 np0005531888 nova_compute[186788]: 2025-11-22 09:12:45.505 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:47 np0005531888 podman[262711]: 2025-11-22 09:12:47.692675828 +0000 UTC m=+0.063403932 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 04:12:47 np0005531888 podman[262712]: 2025-11-22 09:12:47.693814646 +0000 UTC m=+0.059042595 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:12:47 np0005531888 nova_compute[186788]: 2025-11-22 09:12:47.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:48 np0005531888 nova_compute[186788]: 2025-11-22 09:12:48.307 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:50 np0005531888 nova_compute[186788]: 2025-11-22 09:12:50.516 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:51 np0005531888 podman[262751]: 2025-11-22 09:12:51.699664177 +0000 UTC m=+0.062383048 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:12:51 np0005531888 podman[262750]: 2025-11-22 09:12:51.700624701 +0000 UTC m=+0.072252041 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 04:12:51 np0005531888 podman[262752]: 2025-11-22 09:12:51.723654488 +0000 UTC m=+0.086416430 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 04:12:53 np0005531888 nova_compute[186788]: 2025-11-22 09:12:53.432 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:53 np0005531888 nova_compute[186788]: 2025-11-22 09:12:53.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:53 np0005531888 nova_compute[186788]: 2025-11-22 09:12:53.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:12:53 np0005531888 nova_compute[186788]: 2025-11-22 09:12:53.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:12:53 np0005531888 nova_compute[186788]: 2025-11-22 09:12:53.970 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:12:55 np0005531888 nova_compute[186788]: 2025-11-22 09:12:55.517 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:55 np0005531888 nova_compute[186788]: 2025-11-22 09:12:55.964 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:58 np0005531888 nova_compute[186788]: 2025-11-22 09:12:58.435 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:00 np0005531888 nova_compute[186788]: 2025-11-22 09:13:00.521 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:03 np0005531888 nova_compute[186788]: 2025-11-22 09:13:03.437 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:04 np0005531888 nova_compute[186788]: 2025-11-22 09:13:04.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:05 np0005531888 nova_compute[186788]: 2025-11-22 09:13:05.523 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:05 np0005531888 nova_compute[186788]: 2025-11-22 09:13:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:05 np0005531888 nova_compute[186788]: 2025-11-22 09:13:05.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:13:07 np0005531888 podman[262814]: 2025-11-22 09:13:07.673333262 +0000 UTC m=+0.046981928 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:13:07 np0005531888 podman[262813]: 2025-11-22 09:13:07.696206356 +0000 UTC m=+0.071910033 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:13:07 np0005531888 nova_compute[186788]: 2025-11-22 09:13:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:07 np0005531888 nova_compute[186788]: 2025-11-22 09:13:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:07 np0005531888 nova_compute[186788]: 2025-11-22 09:13:07.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:13:07 np0005531888 nova_compute[186788]: 2025-11-22 09:13:07.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:13:07 np0005531888 nova_compute[186788]: 2025-11-22 09:13:07.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:13:07 np0005531888 nova_compute[186788]: 2025-11-22 09:13:07.985 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.131 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.132 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.132 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.132 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.198 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.198 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.452 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.478 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.479 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.479 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:13:08 np0005531888 nova_compute[186788]: 2025-11-22 09:13:08.484 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:09 np0005531888 nova_compute[186788]: 2025-11-22 09:13:09.480 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:10 np0005531888 nova_compute[186788]: 2025-11-22 09:13:10.526 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:13 np0005531888 nova_compute[186788]: 2025-11-22 09:13:13.488 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:14 np0005531888 nova_compute[186788]: 2025-11-22 09:13:14.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:15 np0005531888 nova_compute[186788]: 2025-11-22 09:13:15.528 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:18 np0005531888 nova_compute[186788]: 2025-11-22 09:13:18.489 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:18 np0005531888 podman[262852]: 2025-11-22 09:13:18.682766978 +0000 UTC m=+0.055124109 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 22 04:13:18 np0005531888 podman[262853]: 2025-11-22 09:13:18.688765896 +0000 UTC m=+0.054794621 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:13:20 np0005531888 nova_compute[186788]: 2025-11-22 09:13:20.529 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:22 np0005531888 podman[262894]: 2025-11-22 09:13:22.68344432 +0000 UTC m=+0.056337368 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc.)
Nov 22 04:13:22 np0005531888 podman[262895]: 2025-11-22 09:13:22.689428427 +0000 UTC m=+0.055775564 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:13:22 np0005531888 podman[262896]: 2025-11-22 09:13:22.71672764 +0000 UTC m=+0.077728126 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:13:23 np0005531888 nova_compute[186788]: 2025-11-22 09:13:23.492 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:25 np0005531888 nova_compute[186788]: 2025-11-22 09:13:25.531 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:28 np0005531888 nova_compute[186788]: 2025-11-22 09:13:28.494 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:30 np0005531888 nova_compute[186788]: 2025-11-22 09:13:30.573 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:33 np0005531888 nova_compute[186788]: 2025-11-22 09:13:33.498 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:35 np0005531888 nova_compute[186788]: 2025-11-22 09:13:35.575 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:13:36.898 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:13:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:13:36.899 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:13:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:13:36.899 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:13:37 np0005531888 nova_compute[186788]: 2025-11-22 09:13:37.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:38 np0005531888 nova_compute[186788]: 2025-11-22 09:13:38.501 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:38 np0005531888 systemd[1]: Starting dnf makecache...
Nov 22 04:13:38 np0005531888 podman[262958]: 2025-11-22 09:13:38.693239256 +0000 UTC m=+0.062224104 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 04:13:38 np0005531888 podman[262957]: 2025-11-22 09:13:38.697332316 +0000 UTC m=+0.069660316 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:13:38 np0005531888 dnf[262959]: Metadata cache refreshed recently.
Nov 22 04:13:38 np0005531888 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 04:13:38 np0005531888 systemd[1]: Finished dnf makecache.
Nov 22 04:13:40 np0005531888 nova_compute[186788]: 2025-11-22 09:13:40.605 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:43 np0005531888 nova_compute[186788]: 2025-11-22 09:13:43.503 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:45 np0005531888 nova_compute[186788]: 2025-11-22 09:13:45.607 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:48 np0005531888 nova_compute[186788]: 2025-11-22 09:13:48.505 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:49 np0005531888 podman[263001]: 2025-11-22 09:13:49.68840755 +0000 UTC m=+0.059829044 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 04:13:49 np0005531888 podman[263002]: 2025-11-22 09:13:49.712030012 +0000 UTC m=+0.084035801 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:13:49 np0005531888 nova_compute[186788]: 2025-11-22 09:13:49.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:50 np0005531888 nova_compute[186788]: 2025-11-22 09:13:50.608 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:53 np0005531888 nova_compute[186788]: 2025-11-22 09:13:53.508 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:53 np0005531888 nova_compute[186788]: 2025-11-22 09:13:53.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:53 np0005531888 nova_compute[186788]: 2025-11-22 09:13:53.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:13:53 np0005531888 nova_compute[186788]: 2025-11-22 09:13:53.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:13:53 np0005531888 nova_compute[186788]: 2025-11-22 09:13:53.978 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:13:54 np0005531888 podman[263044]: 2025-11-22 09:13:54.184549749 +0000 UTC m=+0.055667672 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 04:13:54 np0005531888 podman[263045]: 2025-11-22 09:13:54.188772083 +0000 UTC m=+0.056650377 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible)
Nov 22 04:13:54 np0005531888 podman[263046]: 2025-11-22 09:13:54.221147491 +0000 UTC m=+0.084170725 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 04:13:55 np0005531888 nova_compute[186788]: 2025-11-22 09:13:55.611 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:57 np0005531888 nova_compute[186788]: 2025-11-22 09:13:57.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:58 np0005531888 nova_compute[186788]: 2025-11-22 09:13:58.511 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:00 np0005531888 nova_compute[186788]: 2025-11-22 09:14:00.612 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:03 np0005531888 nova_compute[186788]: 2025-11-22 09:14:03.514 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:04 np0005531888 nova_compute[186788]: 2025-11-22 09:14:04.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:05 np0005531888 nova_compute[186788]: 2025-11-22 09:14:05.613 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:06 np0005531888 nova_compute[186788]: 2025-11-22 09:14:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:06 np0005531888 nova_compute[186788]: 2025-11-22 09:14:06.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:14:07 np0005531888 nova_compute[186788]: 2025-11-22 09:14:07.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:07 np0005531888 nova_compute[186788]: 2025-11-22 09:14:07.981 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:14:07 np0005531888 nova_compute[186788]: 2025-11-22 09:14:07.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:14:07 np0005531888 nova_compute[186788]: 2025-11-22 09:14:07.982 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:14:07 np0005531888 nova_compute[186788]: 2025-11-22 09:14:07.982 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.114 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.114 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5683MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.115 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.115 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.180 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.181 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.267 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.296 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.298 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.298 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:14:08 np0005531888 nova_compute[186788]: 2025-11-22 09:14:08.536 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:09 np0005531888 nova_compute[186788]: 2025-11-22 09:14:09.299 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:09 np0005531888 podman[263107]: 2025-11-22 09:14:09.678424975 +0000 UTC m=+0.051927621 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:14:09 np0005531888 podman[263108]: 2025-11-22 09:14:09.706546298 +0000 UTC m=+0.075940213 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 04:14:09 np0005531888 nova_compute[186788]: 2025-11-22 09:14:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:10 np0005531888 nova_compute[186788]: 2025-11-22 09:14:10.617 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:13 np0005531888 nova_compute[186788]: 2025-11-22 09:14:13.540 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:15 np0005531888 nova_compute[186788]: 2025-11-22 09:14:15.620 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:16 np0005531888 nova_compute[186788]: 2025-11-22 09:14:16.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:18 np0005531888 nova_compute[186788]: 2025-11-22 09:14:18.544 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:20 np0005531888 nova_compute[186788]: 2025-11-22 09:14:20.623 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:20 np0005531888 podman[263147]: 2025-11-22 09:14:20.674874123 +0000 UTC m=+0.052012752 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 04:14:20 np0005531888 podman[263148]: 2025-11-22 09:14:20.681723231 +0000 UTC m=+0.053278003 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:14:23 np0005531888 nova_compute[186788]: 2025-11-22 09:14:23.548 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:24 np0005531888 podman[263191]: 2025-11-22 09:14:24.680079938 +0000 UTC m=+0.053227682 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 04:14:24 np0005531888 podman[263190]: 2025-11-22 09:14:24.681285578 +0000 UTC m=+0.058165055 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 22 04:14:24 np0005531888 podman[263192]: 2025-11-22 09:14:24.715332476 +0000 UTC m=+0.084526733 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 04:14:25 np0005531888 nova_compute[186788]: 2025-11-22 09:14:25.625 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:28 np0005531888 nova_compute[186788]: 2025-11-22 09:14:28.550 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:30 np0005531888 nova_compute[186788]: 2025-11-22 09:14:30.627 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:33 np0005531888 nova_compute[186788]: 2025-11-22 09:14:33.553 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:35 np0005531888 nova_compute[186788]: 2025-11-22 09:14:35.629 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:14:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:14:36.899 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:14:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:14:36.900 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:14:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:14:36.900 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:14:38 np0005531888 nova_compute[186788]: 2025-11-22 09:14:38.555 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:40 np0005531888 nova_compute[186788]: 2025-11-22 09:14:40.629 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:40 np0005531888 podman[263257]: 2025-11-22 09:14:40.681304303 +0000 UTC m=+0.054200457 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:14:40 np0005531888 podman[263258]: 2025-11-22 09:14:40.681375865 +0000 UTC m=+0.050306301 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 04:14:43 np0005531888 nova_compute[186788]: 2025-11-22 09:14:43.557 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:45 np0005531888 nova_compute[186788]: 2025-11-22 09:14:45.631 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:48 np0005531888 nova_compute[186788]: 2025-11-22 09:14:48.560 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:50 np0005531888 nova_compute[186788]: 2025-11-22 09:14:50.632 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:50 np0005531888 nova_compute[186788]: 2025-11-22 09:14:50.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:51 np0005531888 podman[263300]: 2025-11-22 09:14:51.676683653 +0000 UTC m=+0.047267224 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:14:51 np0005531888 podman[263299]: 2025-11-22 09:14:51.685493501 +0000 UTC m=+0.059832625 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 04:14:53 np0005531888 nova_compute[186788]: 2025-11-22 09:14:53.563 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:54 np0005531888 nova_compute[186788]: 2025-11-22 09:14:54.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:54 np0005531888 nova_compute[186788]: 2025-11-22 09:14:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:14:54 np0005531888 nova_compute[186788]: 2025-11-22 09:14:54.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:14:54 np0005531888 nova_compute[186788]: 2025-11-22 09:14:54.972 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:14:55 np0005531888 nova_compute[186788]: 2025-11-22 09:14:55.634 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:55 np0005531888 podman[263343]: 2025-11-22 09:14:55.673251035 +0000 UTC m=+0.051358576 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7)
Nov 22 04:14:55 np0005531888 podman[263344]: 2025-11-22 09:14:55.679578631 +0000 UTC m=+0.053156881 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:14:55 np0005531888 podman[263345]: 2025-11-22 09:14:55.714613564 +0000 UTC m=+0.084036791 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 04:14:57 np0005531888 nova_compute[186788]: 2025-11-22 09:14:57.966 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:58 np0005531888 nova_compute[186788]: 2025-11-22 09:14:58.566 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:00 np0005531888 nova_compute[186788]: 2025-11-22 09:15:00.636 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:03 np0005531888 nova_compute[186788]: 2025-11-22 09:15:03.568 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:05 np0005531888 nova_compute[186788]: 2025-11-22 09:15:05.637 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:05 np0005531888 nova_compute[186788]: 2025-11-22 09:15:05.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:06 np0005531888 nova_compute[186788]: 2025-11-22 09:15:06.953 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:06 np0005531888 nova_compute[186788]: 2025-11-22 09:15:06.953 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:15:08 np0005531888 nova_compute[186788]: 2025-11-22 09:15:08.570 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:08 np0005531888 nova_compute[186788]: 2025-11-22 09:15:08.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:09 np0005531888 nova_compute[186788]: 2025-11-22 09:15:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:09 np0005531888 nova_compute[186788]: 2025-11-22 09:15:09.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:09 np0005531888 nova_compute[186788]: 2025-11-22 09:15:09.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:15:09 np0005531888 nova_compute[186788]: 2025-11-22 09:15:09.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:15:09 np0005531888 nova_compute[186788]: 2025-11-22 09:15:09.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:15:09 np0005531888 nova_compute[186788]: 2025-11-22 09:15:09.978 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.129 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.130 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.130 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.131 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.230 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.230 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.252 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.278 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.280 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.280 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:15:10 np0005531888 nova_compute[186788]: 2025-11-22 09:15:10.640 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:11 np0005531888 podman[263408]: 2025-11-22 09:15:11.677430142 +0000 UTC m=+0.055320324 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:15:11 np0005531888 podman[263409]: 2025-11-22 09:15:11.678439297 +0000 UTC m=+0.054263378 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 04:15:13 np0005531888 nova_compute[186788]: 2025-11-22 09:15:13.574 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:15 np0005531888 nova_compute[186788]: 2025-11-22 09:15:15.644 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:18 np0005531888 nova_compute[186788]: 2025-11-22 09:15:18.576 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:19 np0005531888 nova_compute[186788]: 2025-11-22 09:15:19.280 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:20 np0005531888 nova_compute[186788]: 2025-11-22 09:15:20.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:22 np0005531888 podman[263450]: 2025-11-22 09:15:22.674595827 +0000 UTC m=+0.044414615 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:15:22 np0005531888 podman[263449]: 2025-11-22 09:15:22.677567111 +0000 UTC m=+0.051371917 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 04:15:23 np0005531888 nova_compute[186788]: 2025-11-22 09:15:23.578 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:25 np0005531888 nova_compute[186788]: 2025-11-22 09:15:25.647 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:26 np0005531888 podman[263491]: 2025-11-22 09:15:26.695849957 +0000 UTC m=+0.063777052 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 04:15:26 np0005531888 podman[263492]: 2025-11-22 09:15:26.71099182 +0000 UTC m=+0.075284476 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 04:15:26 np0005531888 podman[263490]: 2025-11-22 09:15:26.726448881 +0000 UTC m=+0.094017317 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 04:15:28 np0005531888 nova_compute[186788]: 2025-11-22 09:15:28.602 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:29 np0005531888 nova_compute[186788]: 2025-11-22 09:15:29.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:30 np0005531888 nova_compute[186788]: 2025-11-22 09:15:30.649 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:33 np0005531888 nova_compute[186788]: 2025-11-22 09:15:33.604 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:33 np0005531888 nova_compute[186788]: 2025-11-22 09:15:33.967 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:33 np0005531888 nova_compute[186788]: 2025-11-22 09:15:33.967 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:15:33 np0005531888 nova_compute[186788]: 2025-11-22 09:15:33.984 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:15:35 np0005531888 nova_compute[186788]: 2025-11-22 09:15:35.651 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:15:36.901 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:15:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:15:36.901 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:15:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:15:36.901 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:15:38 np0005531888 nova_compute[186788]: 2025-11-22 09:15:38.652 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:40 np0005531888 nova_compute[186788]: 2025-11-22 09:15:40.654 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:42 np0005531888 podman[263556]: 2025-11-22 09:15:42.689986688 +0000 UTC m=+0.061948068 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:15:42 np0005531888 podman[263557]: 2025-11-22 09:15:42.714416149 +0000 UTC m=+0.084817139 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 04:15:42 np0005531888 nova_compute[186788]: 2025-11-22 09:15:42.966 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:43 np0005531888 nova_compute[186788]: 2025-11-22 09:15:43.655 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:44 np0005531888 nova_compute[186788]: 2025-11-22 09:15:44.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:44 np0005531888 nova_compute[186788]: 2025-11-22 09:15:44.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:15:45 np0005531888 nova_compute[186788]: 2025-11-22 09:15:45.657 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:48 np0005531888 nova_compute[186788]: 2025-11-22 09:15:48.686 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:50 np0005531888 nova_compute[186788]: 2025-11-22 09:15:50.658 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:51 np0005531888 nova_compute[186788]: 2025-11-22 09:15:51.975 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:53 np0005531888 nova_compute[186788]: 2025-11-22 09:15:53.688 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:53 np0005531888 podman[263600]: 2025-11-22 09:15:53.698417459 +0000 UTC m=+0.058204015 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:15:53 np0005531888 podman[263599]: 2025-11-22 09:15:53.7061334 +0000 UTC m=+0.067449634 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 22 04:15:55 np0005531888 nova_compute[186788]: 2025-11-22 09:15:55.659 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:56 np0005531888 nova_compute[186788]: 2025-11-22 09:15:56.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:56 np0005531888 nova_compute[186788]: 2025-11-22 09:15:56.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:15:56 np0005531888 nova_compute[186788]: 2025-11-22 09:15:56.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:15:56 np0005531888 nova_compute[186788]: 2025-11-22 09:15:56.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:15:57 np0005531888 podman[263643]: 2025-11-22 09:15:57.683283122 +0000 UTC m=+0.050965566 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 22 04:15:57 np0005531888 podman[263644]: 2025-11-22 09:15:57.706989536 +0000 UTC m=+0.071608215 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:15:57 np0005531888 podman[263645]: 2025-11-22 09:15:57.738490733 +0000 UTC m=+0.099418510 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 04:15:57 np0005531888 nova_compute[186788]: 2025-11-22 09:15:57.963 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:58 np0005531888 nova_compute[186788]: 2025-11-22 09:15:58.690 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:00 np0005531888 nova_compute[186788]: 2025-11-22 09:16:00.692 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:03 np0005531888 nova_compute[186788]: 2025-11-22 09:16:03.693 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:05 np0005531888 nova_compute[186788]: 2025-11-22 09:16:05.696 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:05 np0005531888 nova_compute[186788]: 2025-11-22 09:16:05.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:07 np0005531888 nova_compute[186788]: 2025-11-22 09:16:07.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:07 np0005531888 nova_compute[186788]: 2025-11-22 09:16:07.954 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:16:08 np0005531888 nova_compute[186788]: 2025-11-22 09:16:08.697 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:08 np0005531888 nova_compute[186788]: 2025-11-22 09:16:08.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:10 np0005531888 nova_compute[186788]: 2025-11-22 09:16:10.738 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:10 np0005531888 nova_compute[186788]: 2025-11-22 09:16:10.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:11 np0005531888 nova_compute[186788]: 2025-11-22 09:16:11.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:11 np0005531888 nova_compute[186788]: 2025-11-22 09:16:11.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:16:11 np0005531888 nova_compute[186788]: 2025-11-22 09:16:11.983 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:16:11 np0005531888 nova_compute[186788]: 2025-11-22 09:16:11.984 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:16:11 np0005531888 nova_compute[186788]: 2025-11-22 09:16:11.984 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.145 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.146 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5701MB free_disk=73.25900268554688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.146 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.146 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.309 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.310 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.325 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing inventories for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.344 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating ProviderTree inventory for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.344 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Updating inventory in ProviderTree for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.361 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing aggregate associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.389 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Refreshing trait associations for resource provider 1afd6948-7df7-46e7-8718-35e2b3007a5d, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.414 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.437 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.438 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:16:12 np0005531888 nova_compute[186788]: 2025-11-22 09:16:12.438 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:16:13 np0005531888 podman[263713]: 2025-11-22 09:16:13.679954994 +0000 UTC m=+0.045723098 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 04:16:13 np0005531888 podman[263712]: 2025-11-22 09:16:13.680888037 +0000 UTC m=+0.050401342 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:16:13 np0005531888 nova_compute[186788]: 2025-11-22 09:16:13.699 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:15 np0005531888 nova_compute[186788]: 2025-11-22 09:16:15.740 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:18 np0005531888 nova_compute[186788]: 2025-11-22 09:16:18.701 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:20 np0005531888 nova_compute[186788]: 2025-11-22 09:16:20.787 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:21 np0005531888 nova_compute[186788]: 2025-11-22 09:16:21.437 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:23 np0005531888 nova_compute[186788]: 2025-11-22 09:16:23.703 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:24 np0005531888 podman[263752]: 2025-11-22 09:16:24.675830007 +0000 UTC m=+0.047805410 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:16:24 np0005531888 podman[263751]: 2025-11-22 09:16:24.688603541 +0000 UTC m=+0.062879620 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 04:16:25 np0005531888 nova_compute[186788]: 2025-11-22 09:16:25.790 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:28 np0005531888 podman[263792]: 2025-11-22 09:16:28.694714758 +0000 UTC m=+0.067376031 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 04:16:28 np0005531888 nova_compute[186788]: 2025-11-22 09:16:28.705 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:28 np0005531888 podman[263793]: 2025-11-22 09:16:28.70981096 +0000 UTC m=+0.076499976 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 22 04:16:28 np0005531888 podman[263794]: 2025-11-22 09:16:28.747315033 +0000 UTC m=+0.110807760 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 04:16:30 np0005531888 nova_compute[186788]: 2025-11-22 09:16:30.792 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:33 np0005531888 nova_compute[186788]: 2025-11-22 09:16:33.707 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:35 np0005531888 nova_compute[186788]: 2025-11-22 09:16:35.795 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.869 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ceilometer_agent_compute[197480]: 2025-11-22 09:16:36.870 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:16:36.902 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:16:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:16:36.903 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:16:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:16:36.903 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:16:38 np0005531888 nova_compute[186788]: 2025-11-22 09:16:38.710 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:40 np0005531888 nova_compute[186788]: 2025-11-22 09:16:40.796 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:43 np0005531888 nova_compute[186788]: 2025-11-22 09:16:43.712 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:44 np0005531888 podman[263855]: 2025-11-22 09:16:44.664964691 +0000 UTC m=+0.042272373 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:16:44 np0005531888 podman[263856]: 2025-11-22 09:16:44.674190498 +0000 UTC m=+0.047021450 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 04:16:45 np0005531888 nova_compute[186788]: 2025-11-22 09:16:45.845 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:48 np0005531888 nova_compute[186788]: 2025-11-22 09:16:48.715 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:50 np0005531888 nova_compute[186788]: 2025-11-22 09:16:50.847 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:53 np0005531888 nova_compute[186788]: 2025-11-22 09:16:53.720 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:53 np0005531888 nova_compute[186788]: 2025-11-22 09:16:53.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:55 np0005531888 podman[263900]: 2025-11-22 09:16:55.685725076 +0000 UTC m=+0.054664297 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:16:55 np0005531888 podman[263899]: 2025-11-22 09:16:55.692335439 +0000 UTC m=+0.061232599 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 04:16:55 np0005531888 nova_compute[186788]: 2025-11-22 09:16:55.913 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:57 np0005531888 nova_compute[186788]: 2025-11-22 09:16:57.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:57 np0005531888 nova_compute[186788]: 2025-11-22 09:16:57.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:16:57 np0005531888 nova_compute[186788]: 2025-11-22 09:16:57.956 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:16:57 np0005531888 nova_compute[186788]: 2025-11-22 09:16:57.969 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:16:58 np0005531888 nova_compute[186788]: 2025-11-22 09:16:58.729 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:59 np0005531888 podman[263942]: 2025-11-22 09:16:59.677770666 +0000 UTC m=+0.053967560 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 22 04:16:59 np0005531888 podman[263943]: 2025-11-22 09:16:59.688533142 +0000 UTC m=+0.056261557 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 04:16:59 np0005531888 podman[263944]: 2025-11-22 09:16:59.714182873 +0000 UTC m=+0.078221887 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 04:16:59 np0005531888 nova_compute[186788]: 2025-11-22 09:16:59.962 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:00 np0005531888 nova_compute[186788]: 2025-11-22 09:17:00.914 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:03 np0005531888 nova_compute[186788]: 2025-11-22 09:17:03.730 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:05 np0005531888 nova_compute[186788]: 2025-11-22 09:17:05.878 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:05 np0005531888 nova_compute[186788]: 2025-11-22 09:17:05.975 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:06 np0005531888 nova_compute[186788]: 2025-11-22 09:17:06.973 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:08 np0005531888 nova_compute[186788]: 2025-11-22 09:17:08.731 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:08 np0005531888 nova_compute[186788]: 2025-11-22 09:17:08.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:09 np0005531888 nova_compute[186788]: 2025-11-22 09:17:09.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:09 np0005531888 nova_compute[186788]: 2025-11-22 09:17:09.955 186792 DEBUG nova.compute.manager [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:17:10 np0005531888 nova_compute[186788]: 2025-11-22 09:17:10.976 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:11 np0005531888 nova_compute[186788]: 2025-11-22 09:17:11.954 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:11 np0005531888 nova_compute[186788]: 2025-11-22 09:17:11.955 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:11 np0005531888 nova_compute[186788]: 2025-11-22 09:17:11.977 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:17:11 np0005531888 nova_compute[186788]: 2025-11-22 09:17:11.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:17:11 np0005531888 nova_compute[186788]: 2025-11-22 09:17:11.978 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:17:11 np0005531888 nova_compute[186788]: 2025-11-22 09:17:11.978 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.133 186792 WARNING nova.virt.libvirt.driver [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.133 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.25939178466797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.134 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.134 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.203 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.204 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.234 186792 DEBUG nova.compute.provider_tree [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed in ProviderTree for provider: 1afd6948-7df7-46e7-8718-35e2b3007a5d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.248 186792 DEBUG nova.scheduler.client.report [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Inventory has not changed for provider 1afd6948-7df7-46e7-8718-35e2b3007a5d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.249 186792 DEBUG nova.compute.resource_tracker [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:17:12 np0005531888 nova_compute[186788]: 2025-11-22 09:17:12.250 186792 DEBUG oslo_concurrency.lockutils [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:17:13 np0005531888 nova_compute[186788]: 2025-11-22 09:17:13.795 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:15 np0005531888 podman[264010]: 2025-11-22 09:17:15.665206733 +0000 UTC m=+0.039768512 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:17:15 np0005531888 podman[264011]: 2025-11-22 09:17:15.677501936 +0000 UTC m=+0.047454610 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 04:17:15 np0005531888 nova_compute[186788]: 2025-11-22 09:17:15.977 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:18 np0005531888 nova_compute[186788]: 2025-11-22 09:17:18.797 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:20 np0005531888 nova_compute[186788]: 2025-11-22 09:17:20.980 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:23 np0005531888 nova_compute[186788]: 2025-11-22 09:17:23.250 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:23 np0005531888 nova_compute[186788]: 2025-11-22 09:17:23.800 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:25 np0005531888 nova_compute[186788]: 2025-11-22 09:17:25.982 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:26 np0005531888 podman[264053]: 2025-11-22 09:17:26.673416778 +0000 UTC m=+0.045178313 container health_status 936c8bbdb57670ae500c4ab52242478105418b532c854e5e800f17ad751b8256 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:17:26 np0005531888 podman[264052]: 2025-11-22 09:17:26.673747786 +0000 UTC m=+0.052903413 container health_status 700e348b810190593def724502052e0d521144de4dc6458bb680553f99b2e90b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:17:28 np0005531888 nova_compute[186788]: 2025-11-22 09:17:28.803 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:30 np0005531888 podman[264096]: 2025-11-22 09:17:30.687329608 +0000 UTC m=+0.056668118 container health_status 5b3720529f72f5f1103ee0695543d72d65d7ca32d5607b13f54fdc6ad54b0c2e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, config_id=edpm, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 22 04:17:30 np0005531888 podman[264097]: 2025-11-22 09:17:30.69269351 +0000 UTC m=+0.056709109 container health_status 7fe368d40eda2c9c7b9f7c25e400ac33a3bddb4fb020b883b737e62d7c5026a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 04:17:30 np0005531888 podman[264098]: 2025-11-22 09:17:30.747584552 +0000 UTC m=+0.108775161 container health_status cbf38c1c1defd3515fa74fab0ac07853bcf1cd082b4b421b3d67fa40ca373006 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:17:30 np0005531888 nova_compute[186788]: 2025-11-22 09:17:30.983 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:33 np0005531888 nova_compute[186788]: 2025-11-22 09:17:33.806 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:35 np0005531888 nova_compute[186788]: 2025-11-22 09:17:35.985 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:17:36.903 104023 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:17:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:17:36.904 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:17:36 np0005531888 ovn_metadata_agent[104018]: 2025-11-22 09:17:36.904 104023 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:17:38 np0005531888 nova_compute[186788]: 2025-11-22 09:17:38.843 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:41 np0005531888 nova_compute[186788]: 2025-11-22 09:17:41.029 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:41 np0005531888 systemd-logind[825]: New session 52 of user zuul.
Nov 22 04:17:41 np0005531888 systemd[1]: Started Session 52 of User zuul.
Nov 22 04:17:42 np0005531888 nova_compute[186788]: 2025-11-22 09:17:42.949 186792 DEBUG oslo_service.periodic_task [None req-c61a496c-7970-4ec4-ae17-c72976159b50 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:43 np0005531888 nova_compute[186788]: 2025-11-22 09:17:43.846 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:46 np0005531888 nova_compute[186788]: 2025-11-22 09:17:46.032 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:46 np0005531888 ovs-vsctl[264338]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 04:17:46 np0005531888 podman[264362]: 2025-11-22 09:17:46.688217713 +0000 UTC m=+0.059670252 container health_status 1b615f93aa3857dba97d3784049b5715f451ccc3349f62e1a52994dfff388120 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:17:46 np0005531888 podman[264364]: 2025-11-22 09:17:46.70799041 +0000 UTC m=+0.079220723 container health_status c195c689eb916a118dbab45baf3adeeabf120ae853a7579782255416706191ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 04:17:47 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 04:17:47 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 04:17:47 np0005531888 virtqemud[186358]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 04:17:48 np0005531888 nova_compute[186788]: 2025-11-22 09:17:48.848 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:50 np0005531888 systemd[1]: Starting Hostname Service...
Nov 22 04:17:50 np0005531888 systemd[1]: Started Hostname Service.
Nov 22 04:17:51 np0005531888 nova_compute[186788]: 2025-11-22 09:17:51.033 186792 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
